
[video_player]fix ios 16 bug where encrypted video stream is not showing

Open hellohuanlin opened this issue 3 years ago • 5 comments

The issue

copyPixelBufferForItemTime always returns nil (with hasNewPixelBufferForItemTime returning false) on iOS 16 for encrypted video streams.
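For context, here is a sketch of the read path that breaks (standard AVFoundation APIs; the stream URL is a placeholder):

```swift
import AVFoundation
import QuartzCore

// Set up an AVPlayerItemVideoOutput the way a texture-based player does.
let item = AVPlayerItem(url: URL(string: "https://example.com/encrypted/stream.m3u8")!)
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
  kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
item.add(output)
let player = AVPlayer(playerItem: item)
player.play()

// Later, on each display-link tick:
let time = output.itemTime(forHostTime: CACurrentMediaTime())
if output.hasNewPixelBuffer(forItemTime: time) {  // stays false on iOS 16 for encrypted streams
  let buffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
}
```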

Research

There isn't much information about this online. It seems to be by design that only Apple's AVPlayerLayer can access encrypted video streams on iOS 16.

For example, this unresolved issue (from 5 years ago, but possibly the same issue in a different form):

the only way to display your protected video content is by using an AVPlayerLayer

Another hint is that iOS 16 introduced a new API AVPlayerLayer::copyDisplayedPixelBuffer with a note:

It also returns nil when displaying protected content
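That note can be observed directly with the new API (a minimal sketch; `player` is assumed to be set up elsewhere):

```swift
import AVFoundation

// AVPlayerLayer.copyDisplayedPixelBuffer() is new in iOS 16. Per its
// documentation, it returns nil for protected content, which is consistent
// with the theory that only AVPlayerLayer itself may display such content.
let layer = AVPlayerLayer(player: player)
if #available(iOS 16.0, *) {
  let buffer: CVPixelBuffer? = layer.copyDisplayedPixelBuffer()
  // buffer is nil while protected content is displayed.
}
```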

Reverse engineer

I almost concluded that there's no way to solve it. But I wondered: how does Apple solve it in AVPlayerLayer? It must access some private API. Something like this:

    if player.videoStream.someMetaData.isProtected {
      player.videoStream._unlockProtectedContent() // a hypothetical private API
    }

If I were to design AVFoundation, I would probably make it a layered structure, so that the "unlocked" pixel buffers can be piped downstream. This means that "unlocking" probably happens towards the upstream, hence all outputs (including ours!) can be granted access. Something like this:

graph TD;
  A[upstream input]-->B[...];
  B-->C[_unlock, hypothetical private API used by AVPlayerLayer];
  C-->D[...];
  D-->E[...];
  E-->F[output 1];
  E-->G[output 2];
  E-->H[output used by AVPlayerLayer];
  E-->I[output used by us];

To verify this hypothesis, I created a dummy AVPlayerLayer (without adding it to the screen), hoping that the hypothetical unlocking happens in the constructor. The result: only the first video frame is displayed, and all other frames are still inaccessible.

This tells us three things:

  1. The layered structure above is likely what Apple is doing.
  2. The hypothetical unlocking did happen in the constructor, but only for the first frame.
  3. The "unlocking" likely happens per frame, not for the whole stream.

Based on all of the above, here is my guess at Apple's implementation of AVPlayerLayer:

class AVPlayerLayer { 
  // `init` only unlocks the first frame 
  init(player: AVPlayer) {
    player.videoStream.firstFrame()._unlockFrame() // hypothetical private API
  }

  // each subsequent frame is unlocked separately
  func renderNextFrame() { 
    player.videoStream.frameAtCurrentTime()._unlockFrame() // hypothetical private API

    // now these pixel buffers become accessible
    let pixelBuffer = ... 
    paintOnCanvas(pixelBuffer)
  }
}

In conclusion, all we need to do is create a dummy AVPlayerLayer that plays the video concurrently; it should invoke this hypothetical renderNextFrame() and eventually unlock every single frame for us.
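Concretely, the workaround can be sketched like this (a simplified illustration, not the exact plugin code; the class and method names are mine):

```swift
import AVFoundation
import QuartzCore

final class VideoPlayerWorkaround {
  let player: AVPlayer
  let output: AVPlayerItemVideoOutput
  // Never added to any view hierarchy; its only job is to drive AVPlayerLayer's
  // internal per-frame "unlocking" so our output can read the pixel buffers.
  private let dummyLayer: AVPlayerLayer

  init(item: AVPlayerItem) {
    player = AVPlayer(playerItem: item)
    output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
      kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    item.add(output)
    dummyLayer = AVPlayerLayer(player: player)
  }

  func copyLatestPixelBuffer() -> CVPixelBuffer? {
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: time) else { return nil }
    return output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
  }
}
```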

And it worked :)

Future

It is pretty obvious that, from iOS 16 onwards, Apple's intention is to allow only AVPlayerLayer to access these pixel buffers.

To follow this spirit, we should probably rely on platform views with AVPlayerLayer attached. This would be a huge change from the current implementation though (CC: @stuartmorgan)

Other notes

Can we enable this workaround only for encrypted files?

It is hard to tell whether a video stream is encrypted. See, for example, the second answer in this question.

Can we enable this only for m3u8 files?

File extensions may not be reliable, and it is not realistic to dig into the m3u8 specs and inspect the metadata.

Performance impact?

Very minimal impact since the layer is not rendered on screen. Screenshots here.

About tests

The current unit test isn't ideal, since it validates the implementation rather than the behavior. A better unit test would validate that copyPixelBuffer actually returns the frames, but that requires setting up the video player with an encrypted video file and playing it. Unfortunately I am not yet familiar enough with how this plugin and Apple's APIs work.


Fixes https://github.com/flutter/flutter/issues/111457


Pre-launch Checklist

  • [x] I read the Contributor Guide and followed the process outlined there for submitting PRs.
  • [x] I read the Tree Hygiene wiki page, which explains my responsibilities.
  • [x] I read and followed the relevant style guides and ran the auto-formatter. (Unlike the flutter/flutter repo, the flutter/plugins repo does use dart format.)
  • [x] I signed the CLA.
  • [x] The title of the PR starts with the name of the plugin surrounded by square brackets, e.g. [shared_preferences]
  • [x] I listed at least one issue that this PR fixes in the description above.
  • [x] I updated pubspec.yaml with an appropriate new version according to the pub versioning philosophy, or this PR is exempt from version changes.
  • [x] I updated CHANGELOG.md to add a description of the change, following repository CHANGELOG style.
  • [x] I updated/added relevant documentation (doc comments with ///).
  • [x] I added new tests to check the change I am making, or this PR is test-exempt.
  • [x] All existing and new tests are passing.


hellohuanlin avatar Sep 16 '22 19:09 hellohuanlin

This seems really fragile.

To follow this spirit, we should probably rely on platform views with AVPlayerLayer attached. This would be a huge change from the current implementation though (CC: @stuartmorgan)

In terms of the actual Flutter behavior, yes, but based on the diff in https://github.com/flutter/flutter/issues/86613 it seems like it might not actually be a huge change?

As discussed there, this is something we've talked about doing in the past. Recent discussions around platform views vs textures (in the context of Android, but they apply more generally) give me pause on actually switching implementation outright, but I think a better option here would probably be:

  • Implement a platform-view-based path on iOS.
  • Add a client-level option to video_player for whether to use textures or platform views (documenting that whether it's supported is platform-specific).
  • Document some of the tradeoffs in the README and let people experiment with it for a while.

Once we have some real-world feedback (e.g., comparing issues reported in both versions) we can decide if we want to switch entirely (and potentially spin out the old implementation as a community-supported unendorsed implementation that people could choose to use).

Until recently I was against maintaining two versions, but my recent experience makes me much more inclined to a gradual, opt-in roll-out.

stuartmorgan-g avatar Sep 19 '22 15:09 stuartmorgan-g

introspecting the file uniform type identifier,

@jmagman which API are you referring to? I found this NSURL API but it's only for local resources.

[url getResourceValue:&type forKey: NSURLTypeIdentifierKey error:&error]

hellohuanlin avatar Sep 19 '22 18:09 hellohuanlin

@stuartmorgan thanks for the insight. I think the "opt-in roll-out" strategy is a good idea. We were lucky to be able to reverse engineer AVFoundation and spot this "loophole", which Apple could close in the future. When that happens, I don't think it will be possible to access the pixel buffers of protected content at all.

hellohuanlin avatar Sep 19 '22 18:09 hellohuanlin

introspecting the file uniform type identifier,

@jmagman which API are you referring to? I found this NSURL API but it's only for local resources.

[url getResourceValue:&type forKey: NSURLTypeIdentifierKey error:&error]

That gets the extended attribute Finder puts on it, which works for local URLs. If you can get the mime type from the NSData then you should be able to get the UTI from that?
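A sketch of that direction (assuming the MIME type is available from the response; UTType is from the UniformTypeIdentifiers framework, iOS 14+):

```swift
import UniformTypeIdentifiers

// Map a MIME type string to a uniform type identifier, without touching the
// file system. Returns nil if no UTI is registered for the MIME type.
func typeIdentifier(forMimeType mimeType: String) -> String? {
  UTType(mimeType: mimeType)?.identifier
}

// e.g. typeIdentifier(forMimeType: "application/vnd.apple.mpegurl")
// would identify an HLS playlist.
```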

jmagman avatar Sep 20 '22 20:09 jmagman

As discussed offline I think we will skip the file type check. I am still looking into integration tests as suggested.

hellohuanlin avatar Sep 20 '22 23:09 hellohuanlin

Was there any way to add an integration test for this? Could you programmatically tell that video was not playing?

@jmagman I looked into this for a few days, but did not find any way to validate whether a video is playing.

So I came up with a workaround: sample 30 screenshots. Having at least 3 distinct screenshots should prove that the video is playing (1 for the loading state, plus 2 distinct video frames).

The original code fails the test and new code passes it.
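The screenshot-sampling check can be sketched as follows (XCUITest; the class name and timings are illustrative, not the exact test in this PR):

```swift
import XCTest

final class VideoPlayingCheck: XCTestCase {
  func testVideoIsPlaying() {
    var distinctFrames = Set<Data>()
    for _ in 0..<30 {
      // Full-screen screenshot of the running example app, compared by PNG bytes.
      distinctFrames.insert(XCUIScreen.main.screenshot().pngRepresentation)
      Thread.sleep(forTimeInterval: 0.5)  // give frames a chance to change
    }
    // 1 loading frame + at least 2 distinct video frames => video is advancing.
    XCTAssertGreaterThanOrEqual(distinctFrames.count, 3,
                                "Must have at least 3 distinct frames.")
  }
}
```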

hellohuanlin avatar Sep 22 '22 18:09 hellohuanlin

 error: -[VideoPlayerUITests testEncryptedVideoStream] : ((frames.count >= 3) is true) failed - Must have at least 3 distinct frames.

This failed in CI.

Hmmm, could be network timeout. Let me try adding a delay.

hellohuanlin avatar Sep 22 '22 18:09 hellohuanlin

auto label is removed for flutter/plugins, pr: 6442, due to - The status or check suite ios-platform_tests CHANNEL:master PACKAGE_SHARDING:--shardIndex 0 --shardCount 4 has failed. Please fix the issues identified (or deflake) before re-applying this label.

auto-submit[bot] avatar Sep 22 '22 22:09 auto-submit[bot]

/var/folders/tn/f_9sf1xx5t14qm_6f83q3b840000gn/T/cirrus-ci-build/packages/video_player/video_player_avfoundation/example/ios/RunnerUITests/VideoPlayerUITests.m:101: error: -[VideoPlayerUITests testEncryptedVideoStream] : ((frames.count >= 3) is true) failed - Must have at least 3 distinct frames.

Unfortunately we can't get the attachments off of Cirrus (that I know of). You could log the png data base64EncodedStringWithOptions and then make a little app to recreate the png from that string on your own machine.

jmagman avatar Sep 22 '22 22:09 jmagman

/var/folders/tn/f_9sf1xx5t14qm_6f83q3b840000gn/T/cirrus-ci-build/packages/video_player/video_player_avfoundation/example/ios/RunnerUITests/VideoPlayerUITests.m:101: error: -[VideoPlayerUITests testEncryptedVideoStream] : ((frames.count >= 3) is true) failed - Must have at least 3 distinct frames.

Unfortunately we can't get the attachments off of Cirrus (that I know of). You could log the png data base64EncodedStringWithOptions and then make a little app to recreate the png from that string on your own machine.

I didn't mean to land that change, I meant to try to figure out why the test was failing in CI. If that test is flaky we really need to fix it so it doesn't keep failing...

jmagman avatar Sep 23 '22 01:09 jmagman

@hellohuanlin this one is not solved by this PR: https://github.com/flutter/flutter/issues/116021

SamerOrfali22 avatar Nov 28 '22 07:11 SamerOrfali22

@SamerOrfali22 As explained in your issue, your video link fails to load even with Apple's vanilla video player. So it is not related to this plugin. You may want to file a radar with Apple here

hellohuanlin avatar Nov 28 '22 19:11 hellohuanlin