damus
Transcode video on the client
@fishcakeday advised:
May I suggest to start with doing the initial transcoding of the video on the client: https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/exporting_video_to_alternative_formats
Library for this:
https://github.com/arthenica/ffmpeg-kit
You do not need libs for this, it is native in iOS
oh nice, this is what chatgpt spit out:
Certainly! Below is a basic example of how you can create a simple video encoder using the built-in iOS libraries in Swift. This example uses AVAssetExportSession to compress a video file.
Example: Compressing a Video with AVAssetExportSession
import AVFoundation

// Compress a video with AVAssetExportSession, reporting the result asynchronously.
func compressVideo(inputURL: URL, outputURL: URL, completion: @escaping (Result<URL, Error>) -> Void) {
    // Create an AVAsset from the input URL
    let asset = AVAsset(url: inputURL)

    // Create an AVAssetExportSession
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality) else {
        completion(.failure(NSError(domain: "ExportSessionError", code: -1, userInfo: nil)))
        return
    }

    // Set the output file type and URL
    exportSession.outputFileType = .mp4
    exportSession.outputURL = outputURL

    // Optionally, trim the video or specify other settings.
    // For example, trimming to the first 30 seconds:
    let timeRange = CMTimeRange(start: .zero, duration: CMTime(seconds: 30, preferredTimescale: 1))
    exportSession.timeRange = timeRange

    // Perform the export
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            // Fall back to a generic error if the session did not report one
            let error = exportSession.error ?? NSError(domain: "ExportSessionFailed", code: -1, userInfo: nil)
            completion(.failure(error))
        case .cancelled:
            completion(.failure(NSError(domain: "ExportSessionCancelled", code: -2, userInfo: nil)))
        default:
            completion(.failure(NSError(domain: "ExportSessionUnknownError", code: -3, userInfo: nil)))
        }
    }
}
// Usage Example
let inputURL = URL(fileURLWithPath: "path/to/your/input/video.mov")
let outputURL = URL(fileURLWithPath: "path/to/your/output/video.mp4")
compressVideo(inputURL: inputURL, outputURL: outputURL) { result in
    switch result {
    case .success(let compressedVideoURL):
        print("Video successfully compressed to \(compressedVideoURL)")
    case .failure(let error):
        print("Video compression failed with error: \(error)")
    }
}
Explanation:
- AVAsset: Represents the video asset you want to compress. It's created using the inputURL that points to the original video file.
- AVAssetExportSession: Handles the export process. We specify a preset (AVAssetExportPresetMediumQuality) to control the quality of the output video.
- outputFileType: Defines the format of the output video file. In this case, we use .mp4.
- timeRange: (Optional) Allows you to trim the video by setting a time range. This example trims the video to the first 30 seconds.
- exportAsynchronously: Asynchronously performs the compression and export, calling the completion handler when finished.
Considerations:
- Error Handling: The completion handler checks the status of the export session and handles potential errors.
- Output Path: Ensure that the outputURL points to a valid and writable location.
This code provides a basic framework that you can expand upon depending on your needs, such as adding more customization to the compression settings or integrating it with a user interface.
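One customization worth adding up front: not every preset works for every asset and output type, so it may be worth checking compatibility before exporting. A minimal sketch using the AVAssetExportSession class methods (the input path is a placeholder):

```swift
import AVFoundation

let asset = AVAsset(url: URL(fileURLWithPath: "path/to/your/input/video.mov"))

// List every preset that can be used with this particular asset.
let presets = AVAssetExportSession.exportPresets(compatibleWith: asset)
print("Compatible presets: \(presets)")

// Check a specific preset/output-type combination before creating the session.
AVAssetExportSession.determineCompatibility(
    ofExportPreset: AVAssetExportPresetMediumQuality,
    with: asset,
    outputFileType: .mp4
) { isCompatible in
    if !isCompatible {
        print("Medium quality -> .mp4 is not supported for this asset")
    }
}
```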
I'm guessing we would need ffmpeg-kit if we wanted to do HLS transcoding on the client.
ChatGPT says we might be able to do it with iOS, but will have to verify:
Yes, you can transcode video for HLS (HTTP Live Streaming) using iOS's built-in libraries, but it's a more complex task than simple video compression. HLS transcoding involves creating multiple renditions (different resolutions and bitrates) of the video and generating an HLS playlist (a .m3u8 file) that references these renditions.
Overview:
To create HLS streams, you’ll need to:
- Segment the video into multiple chunks.
- Generate renditions of the video at different bitrates and resolutions.
- Create an HLS playlist that lists the segments.
Step-by-Step Guide:
1. Set Up the Export Session for HLS:
You can use AVAssetExportSession to create different renditions of your video.
2. Create Multiple Renditions:
Each rendition will be exported with a different preset, such as AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality, etc.
3. Generate the HLS Playlist:
Finally, you will need to manually generate the .m3u8 playlist file or use a third-party library to do this, as iOS doesn't provide a built-in way to generate HLS playlists directly.
Example: Basic HLS Transcoding (Conceptual Code)
import AVFoundation
// Conceptual sketch: export several renditions, then write a master playlist.
func transcodeForHLS(inputURL: URL, outputDirectory: URL, completion: @escaping (Result<URL, Error>) -> Void) {
    // Preset configurations for different renditions
    let presets = [
        AVAssetExportPresetLowQuality,
        AVAssetExportPresetMediumQuality,
        AVAssetExportPresetHighestQuality
    ]

    // A list to store output URLs for each rendition
    var renditionURLs: [URL] = []

    let asset = AVAsset(url: inputURL)

    // Dispatch group to wait for all renditions to be completed
    let dispatchGroup = DispatchGroup()

    for preset in presets {
        guard let exportSession = AVAssetExportSession(asset: asset, presetName: preset) else {
            completion(.failure(NSError(domain: "ExportSessionError", code: -1, userInfo: nil)))
            return
        }

        dispatchGroup.enter()

        // Match the file extension to the output type (.movpkg)
        let outputURL = outputDirectory.appendingPathComponent(UUID().uuidString).appendingPathExtension("movpkg")
        exportSession.outputFileType = .movpkg
        exportSession.outputURL = outputURL
        exportSession.shouldOptimizeForNetworkUse = true

        exportSession.exportAsynchronously {
            if exportSession.status == .completed {
                renditionURLs.append(outputURL)
            }
            // Leave the group whether the export succeeded or failed
            dispatchGroup.leave()
        }
    }

    // Once all exports are completed, generate the .m3u8 playlist
    dispatchGroup.notify(queue: .main) {
        do {
            let playlistURL = try generateHLSPlaylist(outputDirectory: outputDirectory, renditionURLs: renditionURLs)
            completion(.success(playlistURL))
        } catch {
            completion(.failure(error))
        }
    }
}
// Function to generate HLS playlist
func generateHLSPlaylist(outputDirectory: URL, renditionURLs: [URL]) throws -> URL {
let playlistContent = """
#EXTM3U
#EXT-X-VERSION:3
"""
var playlist = playlistContent
for (index, renditionURL) in renditionURLs.enumerated() {
playlist += """
#EXT-X-STREAM-INF:BANDWIDTH=\(index * 500000),RESOLUTION=\(1920 - index * 480)x1080,AVERAGE-BANDWIDTH=\(index * 400000)
\(renditionURL.lastPathComponent)
"""
}
let playlistURL = outputDirectory.appendingPathComponent("playlist.m3u8")
try playlist.write(to: playlistURL, atomically: true, encoding: .utf8)
return playlistURL
}
// Example of calling the HLS transcoding function
let inputURL = URL(fileURLWithPath: "path/to/your/input/video.mov")
let outputDirectory = URL(fileURLWithPath: "path/to/your/output/directory")
transcodeForHLS(inputURL: inputURL, outputDirectory: outputDirectory) { result in
    switch result {
    case .success(let playlistURL):
        print("HLS playlist successfully created at \(playlistURL)")
    case .failure(let error):
        print("HLS transcoding failed with error: \(error)")
    }
}
Explanation:
- Presets for Renditions: Different quality presets (AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality, etc.) are used to create different renditions of the video. These correspond to different bitrates and resolutions.
- Dispatch Group: A DispatchGroup is used to wait for all export sessions to complete before generating the final .m3u8 playlist file.
- HLS Playlist Generation: The .m3u8 playlist file is manually generated, listing all the video renditions. Each entry includes information about the bandwidth and resolution.
- Output Files: The output HLS segments and playlist files are saved in the specified directory.
Considerations:
- File Format: HLS segments are typically .ts (MPEG-2 Transport Stream) files, but this example uses .movpkg as a placeholder.
- Optimization: For a production-level HLS transcoder, you may need more sophisticated error handling, segmenting with correct durations, and possibly encryption.
- HLS Features: Advanced HLS features like alternate audio tracks, subtitles, and encryption are not covered here.
Final Thoughts:
- While iOS does not have a native API for HLS transcoding out of the box, you can implement it using AVAssetExportSession in combination with custom code to generate the necessary playlists. For more complex and production-level HLS transcoding, you might need a more comprehensive solution, possibly involving server-side processing or third-party libraries.
I replied on nostr, and I suggest against doing VBR on the client. For the native AV library, please take a look at the presets, and also make sure you use the network-optimized setting. I think I suggested medium quality and network optimized.
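For reference, that suggested combination maps onto two settings on the export session (a sketch; the input path is a placeholder):

```swift
import AVFoundation

let asset = AVAsset(url: URL(fileURLWithPath: "path/to/your/input/video.mov"))

// Medium-quality preset plus network-optimized output, as suggested above.
if let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality) {
    // Writes the movie header (moov atom) at the front of the file,
    // so playback can begin before the whole file has downloaded.
    session.shouldOptimizeForNetworkUse = true
}
```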
.ts is old crap, and I am more likely to do CMAF, which will work for both DASH and HLS in one go.
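For context: CMAF packages the media as fragmented MP4 segments, and a single set of those segments can be referenced by both an HLS playlist and a DASH manifest, so you only encode and store one copy. A hypothetical HLS media playlist over fMP4/CMAF segments (file names made up) looks like:

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.0,
segment0.m4s
#EXTINF:6.0,
segment1.m4s
#EXT-X-ENDLIST
```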
TIL! sounds good to me