AVPlayer buffering

When playing a video with an AVPlayer, you sometimes want to be aware of the buffering state in order to update your interface. For instance, you can:

  • show an activity indicator when the player stalls due to buffering
  • build your own progress bar and show, in a different color than the playback progression, the point up to which the video has been loaded

(Note that in the following examples, I assume we are at the view model level, updating dynamic properties that a view controller can observe through KVO to react and update the interface; feel free to use reactive programming with RxSwift or Combine instead.)

Detecting changes in buffer state

In order to show an activity indicator when the player stalls, we need to register 3 observers using KVO (Key-Value Observing) on the following dynamic properties of an AVPlayerItem:

  • isPlaybackBufferEmpty
  • isPlaybackBufferFull
  • isPlaybackLikelyToKeepUp

@objc private(set) dynamic var isStall: Bool = false

// MARK: - Buffering KVO

private var isPlaybackBufferEmptyObserver: NSKeyValueObservation?
private var isPlaybackBufferFullObserver: NSKeyValueObservation?
private var isPlaybackLikelyToKeepUpObserver: NSKeyValueObservation?

private func observeBuffering(for playerItem: AVPlayerItem) {
    isPlaybackBufferEmptyObserver = playerItem.observe(\.isPlaybackBufferEmpty, changeHandler: onIsPlaybackBufferEmptyObserverChanged)
    isPlaybackBufferFullObserver = playerItem.observe(\.isPlaybackBufferFull, changeHandler: onIsPlaybackBufferFullObserverChanged)
    isPlaybackLikelyToKeepUpObserver = playerItem.observe(\.isPlaybackLikelyToKeepUp, changeHandler: onIsPlaybackLikelyToKeepUpObserverChanged)
}

private func onIsPlaybackBufferEmptyObserverChanged(playerItem: AVPlayerItem, change: NSKeyValueObservedChange<Bool>) {
    if playerItem.isPlaybackBufferEmpty {
        isStall = true
    }
}

private func onIsPlaybackBufferFullObserverChanged(playerItem: AVPlayerItem, change: NSKeyValueObservedChange<Bool>) {
    if playerItem.isPlaybackBufferFull {
        isStall = false
    }
}

private func onIsPlaybackLikelyToKeepUpObserverChanged(playerItem: AVPlayerItem, change: NSKeyValueObservedChange<Bool>) {
    if playerItem.isPlaybackLikelyToKeepUp {
        isStall = false
    }
}

When the updates are received, we can react accordingly (a view-controller-side sketch follows the list):

  • isPlaybackBufferEmpty = true: the player has to fill the buffer and is definitely stalling; this is a good place to start the activity indicator
  • isPlaybackBufferFull = true: the player has filled the buffer completely; at this stage it has more than enough data to keep playing and is not stalling, so the activity indicator must be stopped
  • isPlaybackLikelyToKeepUp = true: the player has buffered enough to start playing; at this stage it will resume playing if not paused and is not stalling, so the activity indicator can be stopped
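
For illustration, here is a minimal view-controller-side sketch. It assumes the view model above is an NSObject subclass (PlayerViewModel is a hypothetical name) and that the view controller owns a UIActivityIndicatorView outlet (activityIndicator, also hypothetical):

private var isStallObserver: NSKeyValueObservation?

private func observeStall(on viewModel: PlayerViewModel) {
    // Toggle the spinner whenever the view model's isStall flag changes.
    isStallObserver = viewModel.observe(\.isStall, options: [.initial, .new]) { [weak self] viewModel, _ in
        if viewModel.isStall {
            self?.activityIndicator.startAnimating()
        } else {
            self?.activityIndicator.stopAnimating()
        }
    }
}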

Detecting up to which point the video is buffered

In order to convert the loaded time ranges into a percentage of the video, we need two pieces of information:

  • the video duration
  • the available time ranges, i.e. what has been loaded already

Getting video duration

For the duration of the video, once again a KVO observer, this time on the duration property of the AVPlayerItem, will do the trick:

@objc private(set) dynamic var duration: TimeInterval = 0.0

// MARK: - Duration KVO

private var durationObserver: NSKeyValueObservation?

private func observeDuration(for playerItem: AVPlayerItem) {
    durationObserver = playerItem.observe(\.duration, changeHandler: { [weak self] (playerItem, _) in
        self?.duration = playerItem.duration.seconds
    })
}

Receiving periodic time updates

At the AVPlayer level, we can add a periodic time observer that will invoke our callback as close to the requested interval as possible, in the following case every half second:

private var timeObserver: Any?

let player = AVPlayer(playerItem: playerItem)
// The returned token must be retained for as long as we want to receive updates.
timeObserver = player.addPeriodicTimeObserver(forInterval: CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC)),
                                              queue: DispatchQueue.main,
                                              using: handleTimeChanged)
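
When we are done with the player, the token should also be handed back to it; a minimal cleanup sketch, assuming both the player and the timeObserver token above are stored as properties:

deinit {
    // Stop receiving periodic time updates.
    if let timeObserver = timeObserver {
        player.removeTimeObserver(timeObserver)
    }
}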

So every half second we enter our callback, which is a good place to refresh our local representation of the buffer:

private func handleTimeChanged(time: CMTime) {
    refreshBuffered()
    refreshProgression(time: time)
}

Refreshing loaded buffer

The AVPlayerItem has a property called loadedTimeRanges that contains everything we need. We take the first range's timeRangeValue, if it exists, and compute the available duration from it.

Based on the video duration, we can transform the available duration (or buffered duration) into a percentage of the video:

@objc private(set) dynamic var buffered: Float = 0.0

private func refreshBuffered() {
    buffered = Float(availableDuration / duration)
}

private var availableDuration: Double {
    guard let timeRange = player.currentItem?.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let startSeconds = timeRange.start.seconds
    let durationSeconds = timeRange.duration.seconds
    return startSeconds + durationSeconds
}

This is not perfect: the buffer obviously doesn't contain all the video data between 0.0 and availableDuration, but it is good enough to show on a UIProgressView.
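
On the view controller side, the buffered property can then feed a UIProgressView through KVO, in the same spirit as the isStall sketch earlier (PlayerViewModel and bufferProgressView are hypothetical names):

private var bufferedObserver: NSKeyValueObservation?

private func observeBuffered(on viewModel: PlayerViewModel) {
    // Reflect the buffered percentage (0.0 ... 1.0) on the progress view.
    bufferedObserver = viewModel.observe(\.buffered, options: [.initial, .new]) { [weak self] viewModel, _ in
        self?.bufferProgressView.progress = viewModel.buffered
    }
}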

Bonus: refreshing progression

Because we receive periodic time updates, this is also a good place to update the playback progression in the model; below I do it in two forms:

  • currentTime (TimeInterval) to be formatted and displayed in a label (see the formatting sketch after the code below)
  • progress (Float) to configure a UISlider and see progress visually

@objc private(set) dynamic var currentTime: TimeInterval = 0.0
@objc private(set) dynamic var progress: Float = 0.0

private func refreshProgression(time: CMTime) {
    currentTime = time.seconds
    progress = Float(currentTime / duration)
}
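
As for formatting currentTime for a label, a DateComponentsFormatter does the job; a small sketch (this particular formatter configuration is just one possible choice):

private let timeFormatter: DateComponentsFormatter = {
    let formatter = DateComponentsFormatter()
    formatter.allowedUnits = [.minute, .second]
    formatter.zeroFormattingBehavior = .pad
    return formatter
}()

// e.g. 83.2 seconds -> "01:23"
func formattedTime(_ time: TimeInterval) -> String {
    return timeFormatter.string(from: time) ?? "00:00"
}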

Observing Low Data Mode and other expensive network paths

In iOS 13, Apple introduced Low Data Mode, an option users can enable on a per-network basis to ask the system, and developers, to consume as little data as possible.

It is then the developer's responsibility to honor this preference. It is easy to support when using URLSession (see the sketch after the following code), but you can also observe the status of the current network path, and the changes that may occur while your application or your screen is active, using the Network framework:

import Network

let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.isConstrained {
        // Current network is constrained by user preference, aka iOS 13+ Low Data Mode
    } else if path.isExpensive {
        // Current network is considered expensive (eg: cellular, hotspot)
    } else {
        // Current network hasn't anything special, most likely is WiFi
    }
}
monitor.start(queue: DispatchQueue.global(qos: .background))
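
On the URLSession side, the preference mentioned above maps to flags on URLSessionConfiguration; a minimal sketch (which policy you actually apply is up to you):

let configuration = URLSessionConfiguration.default
// Do not let this session's requests run over networks constrained by Low Data Mode.
configuration.allowsConstrainedNetworkAccess = false
// Optionally avoid expensive networks (cellular, hotspots) as well.
configuration.allowsExpensiveNetworkAccess = false
let session = URLSession(configuration: configuration)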

(Swift 5.1 / iOS 13.1)

SwiftUI Ambiguous reference to member ‘buildBlock()’

So this works:

import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Text 1")
            Text("Text 2")
            Text("Text 3")
            Text("Text 4")
            Text("Text 5")
            Text("Text 6")
            Text("Text 7")
            Text("Text 8)
            Text("Text 9")
            Text("Text 10")
        }
    }
}

But this doesn’t:

import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Text 1")
            Text("Text 2")
            Text("Text 3")
            Text("Text 4")
            Text("Text 5")
            Text("Text 6")
            Text("Text 7")
            Text("Text 8)
            Text("Text 9")
            Text("Text 10")
            Text("Text 11")
        }
    }
}

The compiler refuses to build it and emits the following error: Ambiguous reference to member 'buildBlock()'.

This is because SwiftUI's view building system, the ViewBuilder type, only declares buildBlock overloads for 1, 2, 3… up to 10 views, and nothing for 11 and beyond.

The solution is to wrap up to 10 elements in a Group, which lets you break up your component structure, for example:

import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Group {
                Text("Text 1")
                Text("Text 2")
                Text("Text 3")
                Text("Text 4")
                Text("Text 5")
                Text("Text 6")
                Text("Text 7")
                Text("Text 8)
                Text("Text 9")
                Text("Text 10")
            }
            Text("Text 11")
        }
    }
}

Sort by Name

A simple trick in Xcode, but an important one nevertheless, since keeping things organized is crucial for you and your colleagues.

For instance, you can reorder the files of a group alphabetically by right-clicking on the group (the Views folder here) and selecting Sort by Name.

Preserving a minimum tappable area for UIButton

The Apple Human Interface Guidelines say:

Provide ample touch targets for interactive elements. Try to maintain a minimum tappable area of 44pt x 44pt for all controls.

(https://developer.apple.com/design/human-interface-guidelines/ios/visual-design/adaptivity-and-layout/)

But sometimes, your graphic designer only gives you a tiny image and:

  1. you don’t want to deal with the button insets in your layout
  2. you still want your users to have a reasonable hit zone

The solution is to override the hitTest(_:with:) method of UIButton so that it returns the button even if the tap is slightly out of bounds (as long as it stays within the 44×44 point rule):

import UIKit

class MinimumTouchAreaButton: UIButton {
    override func hitTest(_ point: CGPoint, with _: UIEvent?) -> UIView? {
        guard !isHidden, isUserInteractionEnabled, alpha > 0 else {
            return nil
        }
        // Expand the bounds so the tappable area is at least 44x44 points.
        let expandedBounds = bounds.insetBy(dx: min(bounds.width - 44.0, 0), dy: min(bounds.height - 44.0, 0))
        return expandedBounds.contains(point) ? self : nil
    }
}
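
Usage stays the same as any UIButton; a quick sketch (the image name is only an example):

let closeButton = MinimumTouchAreaButton(type: .custom)
closeButton.setImage(UIImage(named: "close-small"), for: .normal)
// Even with a tiny image and no extra insets, taps landing slightly outside
// the visible bounds still reach the button thanks to hitTest(_:with:).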