Build a complete app from scratch in a weekend? Challenge accepted! I started the project on Friday evening and submitted the app for App Store review on Sunday evening. It was accepted during the night and live on Monday morning.
So what did I have time to do in just a weekend? Quite a lot, actually:
SwiftUI: I had experimented a tiny bit with SwiftUI before, but nothing compares to building a real app!
GPS-based location is retrieved in the background, reverse geocoded, and stored in Core Data
Forms: automatically recording locations is cool, but you sometimes have to manually add or edit visits; SwiftUI makes it particularly easy to create dynamic forms with very little code and handles pretty much everything for you
In-App Purchase with a paywall selling a non-consumable product to unlock some features like yearly / country reports and PDF export
PDF generation from HTML templates, and export using a SwiftUI-wrapped UIActivityViewController
Analytics and Crashlytics integration using Firebase
App Store preparation: finding a name, writing description, creating a logo, preparing screenshots, submitting the In-App Purchase product
Submission!
What kind of issues did I run into?
SwiftUI is fun, but sometimes it’s just really hard to do something really simple (like presenting two modals / sheets from the same view), and sometimes you simply can’t do something simple at all (like removing the disclosure indicator from list cells without triggering random crashes)
Fastlane didn’t want to work on my machine; I wanted to automate the screenshots but couldn’t. That was OK though: since there is only one language for now and the app only supports the iPhone, taking screenshots manually didn’t take too long
Apple randomly disabled submission from the version of Xcode available on the Mac App Store, and obviously the error message when submitting was super obscure… I had to download the upcoming GM from the developer download center
Is the code clean? Hell no! But was it fun to do? Absolutely! I don’t know if this app will ever sustain itself, but I have to admit we live in a time where very powerful tools are available for us to experiment and iterate really quickly. I’ll definitely take on this kind of challenge again 🙂
When creating a custom video player, you need a component halfway between a UISlider and a UIProgressView: one that lets you interactively track and seek to a particular position in the video, but also continuously shows progress and, when needed, buffering.
Even though there isn’t any such component directly available in UIKit, it turns out it’s fairly easy to build one that is interactive and fully customizable:
Creating a custom view to display progress and buffer
Let’s start with the basics: we need a subclass of UIView in which we override all possible initializers:
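A minimal sketch of such a subclass could look like this (the class name VideoProgressView and the setup hook are my own choices):

```swift
import UIKit

// Hypothetical name for our custom component.
// @IBDesignable lets Interface Builder render the view live later on.
@IBDesignable
class VideoProgressView: UIView {

    // Called when the view is created in code.
    override init(frame: CGRect) {
        super.init(frame: frame)
        setup()
    }

    // Called when the view is loaded from a storyboard or xib.
    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setup()
    }

    private func setup() {
        // Subviews and constraints will be added here in the next steps.
    }
}
```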
At this stage, nothing appears in Interface Builder yet; this is normal, since we haven’t added anything to our view.
The progress view will be composed of several layers: the track in the background, the buffer above it, the actual progress above that, and finally the knob, or thumb, on top.
Let’s create and add our subviews:
private lazy var trackView: UIView = {
    let view = UIView()
    view.translatesAutoresizingMaskIntoConstraints = false
    view.layer.masksToBounds = true
    return view
}()

private lazy var progressView: UIView = {
    let view = UIView()
    view.translatesAutoresizingMaskIntoConstraints = false
    return view
}()

private lazy var bufferView: UIView = {
    let view = UIView()
    view.translatesAutoresizingMaskIntoConstraints = false
    return view
}()

private lazy var thumbView: UIView = {
    let view = UIView()
    view.translatesAutoresizingMaskIntoConstraints = false
    view.layer.masksToBounds = true
    return view
}()
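The subviews then need to be inserted into the hierarchy from back to front; assuming a setup() method called from the initializers, it could look like:

```swift
private func setup() {
    // Order matters: track at the back, then buffer, then progress, thumb on top.
    addSubview(trackView)
    trackView.addSubview(bufferView)
    trackView.addSubview(progressView)
    addSubview(thumbView)
}
```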
private var trackHeightConstraint: NSLayoutConstraint!
private var thumbHeightConstraint: NSLayoutConstraint!
private var progressViewWidthConstraint: NSLayoutConstraint!
private var bufferViewWidthConstraint: NSLayoutConstraint!
The layout uses Auto Layout, and we can observe that the constraints all depend on some customizable properties, which we will mark as @IBInspectable so we can modify them directly from the Interface Builder inspector:
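Here is a sketch of what those inspectable properties and the constraint setup could look like (property names and default values are my own choices):

```swift
@IBInspectable var trackHeight: CGFloat = 4 {
    didSet {
        trackHeightConstraint.constant = trackHeight
        invalidateIntrinsicContentSize()
    }
}

@IBInspectable var thumbSize: CGFloat = 12 {
    didSet {
        thumbHeightConstraint.constant = thumbSize
        invalidateIntrinsicContentSize()
    }
}

private func setupConstraints() {
    trackHeightConstraint = trackView.heightAnchor.constraint(equalToConstant: trackHeight)
    thumbHeightConstraint = thumbView.heightAnchor.constraint(equalToConstant: thumbSize)
    progressViewWidthConstraint = progressView.widthAnchor.constraint(equalToConstant: 0)
    bufferViewWidthConstraint = bufferView.widthAnchor.constraint(equalToConstant: 0)

    NSLayoutConstraint.activate([
        // The track spans the whole width, vertically centered.
        trackView.leadingAnchor.constraint(equalTo: leadingAnchor),
        trackView.trailingAnchor.constraint(equalTo: trailingAnchor),
        trackView.centerYAnchor.constraint(equalTo: centerYAnchor),
        trackHeightConstraint,
        // Buffer and progress grow from the leading edge of the track.
        bufferView.leadingAnchor.constraint(equalTo: trackView.leadingAnchor),
        bufferView.topAnchor.constraint(equalTo: trackView.topAnchor),
        bufferView.bottomAnchor.constraint(equalTo: trackView.bottomAnchor),
        bufferViewWidthConstraint,
        progressView.leadingAnchor.constraint(equalTo: trackView.leadingAnchor),
        progressView.topAnchor.constraint(equalTo: trackView.topAnchor),
        progressView.bottomAnchor.constraint(equalTo: trackView.bottomAnchor),
        progressViewWidthConstraint,
        // The thumb follows the end of the progress.
        thumbView.centerXAnchor.constraint(equalTo: progressView.trailingAnchor),
        thumbView.centerYAnchor.constraint(equalTo: centerYAnchor),
        thumbHeightConstraint,
        thumbView.widthAnchor.constraint(equalTo: thumbView.heightAnchor),
    ])
}
```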
Finally, it’s important to override both layoutSubviews, to make sure our subviews are properly placed when the screen size changes (final size, rotations, etc.), and intrinsicContentSize, especially because we want the height to be decided automatically based on our track height and thumb height instead of adding a constraint ourselves:
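A possible version of those overrides, assuming the trackHeight and thumbSize inspectable properties from before, plus progress and buffer values in the 0…1 range (names are mine):

```swift
var progress: Float = 0 {
    didSet { setNeedsLayout() }
}

var buffer: Float = 0 {
    didSet { setNeedsLayout() }
}

override func layoutSubviews() {
    super.layoutSubviews()
    // Round the corners to match the current heights.
    trackView.layer.cornerRadius = trackHeight / 2
    thumbView.layer.cornerRadius = thumbSize / 2
    // Resize the progress and buffer subviews against the current bounds.
    progressViewWidthConstraint.constant = bounds.width * CGFloat(progress)
    bufferViewWidthConstraint.constant = bounds.width * CGFloat(buffer)
}

override var intrinsicContentSize: CGSize {
    // The height derives from the track and thumb sizes, no explicit constraint needed.
    return CGSize(width: UIView.noIntrinsicMetric, height: max(trackHeight, thumbSize))
}
```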
When playing a video in an AVPlayer, you sometimes want to be aware of buffering in order to update your interface; for instance, you can:
show an activity indicator when the player stalls due to buffering
create your own progress bar and show, in a different color than the playback progress, the point up to which the video is loaded
(Note that in the following examples, I consider myself to be at the view model level, updating dynamic properties that a view controller could observe using KVO to react and update the interface; you could also use reactive programming with RxSwift or Combine instead.)
Detecting changes in buffer state
In order to show an activity indicator when the player stalls, we need to register three observers using KVO (Key-Value Observing) on the following dynamic properties of an AVPlayerItem: isPlaybackBufferEmpty, isPlaybackBufferFull, and isPlaybackLikelyToKeepUp.
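Using block-based KVO, the registration could look like this (the observeBuffering and updateLoadingState names are my own):

```swift
import AVFoundation

private var bufferObservations: [NSKeyValueObservation] = []

func observeBuffering(of item: AVPlayerItem) {
    // Keep the observation tokens alive; releasing them stops the observation.
    bufferObservations = [
        item.observe(\.isPlaybackBufferEmpty, options: [.new]) { [weak self] item, _ in
            self?.updateLoadingState(for: item)
        },
        item.observe(\.isPlaybackBufferFull, options: [.new]) { [weak self] item, _ in
            self?.updateLoadingState(for: item)
        },
        item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { [weak self] item, _ in
            self?.updateLoadingState(for: item)
        },
    ]
}
```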
When updates are received, we can then react accordingly:
isPlaybackBufferEmpty = true: the player has to fill the buffer and is definitely stalling; this is a good place to start the activity indicator
isPlaybackBufferFull = true: the player has filled the buffer completely; at this stage it has more than enough to play and is not stalling, so the activity indicator must be stopped
isPlaybackLikelyToKeepUp = true: the player has filled enough of the buffer to start playing; at this stage, it will resume playing if not paused and is not stalling, so the activity indicator can be stopped
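Condensed into a single handler exposing a dynamic isLoading property (the property name is my own), this could give:

```swift
import AVFoundation

@objc private(set) dynamic var isLoading = false

private func updateLoadingState(for item: AVPlayerItem) {
    if item.isPlaybackBufferEmpty {
        // The buffer has to be refilled: we are stalling, start the activity indicator.
        isLoading = true
    } else if item.isPlaybackBufferFull || item.isPlaybackLikelyToKeepUp {
        // Enough data is buffered to play: stop the activity indicator.
        isLoading = false
    }
}
```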
Detecting up to what point of the video is buffered
In order to convert the loaded time ranges into a percentage of the video, we will need to retrieve and extract two pieces of information:
the video duration
the available time ranges, a.k.a. what’s been loaded already
Getting video duration
For the duration of the video, again, a KVO observer on the duration property of the AVPlayerItem will do the trick:
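For example, exposing the duration as a dynamic property (the property and observation names are mine):

```swift
import AVFoundation

@objc private(set) dynamic var duration: TimeInterval = 0

private var durationObservation: NSKeyValueObservation?

func observeDuration(of item: AVPlayerItem) {
    durationObservation = item.observe(\.duration, options: [.new]) { [weak self] item, _ in
        // The duration is indefinite (NaN) until the item is ready to play.
        let seconds = item.duration.seconds
        self?.duration = seconds.isFinite ? seconds : 0
    }
}
```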
At the AVPlayer level, we can add a periodic time observer that will invoke our callback as close to the requested interval as possible, in the following case every half-second:
let player = AVPlayer(playerItem: playerItem)
player.addPeriodicTimeObserver(forInterval: CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC)),
                               queue: DispatchQueue.main,
                               using: handleTimeChanged)
So every half-second, we enter our callback, and this is a good place to refresh our local representation of the buffer:
The AVPlayerItem has a property called loadedTimeRanges that has everything we need. We get its first timeRangeValue if it exists, and then compute the available duration.
Based on the video duration, we can transform the available duration (or buffered duration) into a percentage of the video:
@objc private(set) dynamic var buffered: Float = 0.0

private func refreshBuffered() {
    buffered = Float(availableDuration / duration)
}

private var availableDuration: Double {
    guard let timeRange = player.currentItem?.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let startSeconds = timeRange.start.seconds
    let durationSeconds = timeRange.duration.seconds
    return startSeconds + durationSeconds
}
This is not perfect; obviously the buffer doesn’t contain all the video data between 0.0 and availableDuration, but it is good enough to display in a UIProgressView.
Bonus: refreshing progression
Because we receive periodic time updates, this is also a good place to update the progression in our model; below I do it in two forms:
currentTime (TimeInterval) to be formatted and displayed in a label
progress (Float) to configure a UISlider and see progress visually
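A sketch of the periodic callback updating both values (assuming the duration property observed earlier):

```swift
import AVFoundation

@objc private(set) dynamic var currentTime: TimeInterval = 0
@objc private(set) dynamic var progress: Float = 0

private func handleTimeChanged(_ time: CMTime) {
    currentTime = time.seconds
    // Guard against a division by zero while the duration is still unknown.
    progress = duration > 0 ? Float(currentTime / duration) : 0
    // Refresh the buffered percentage as well, since we are called regularly.
    refreshBuffered()
}
```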
In iOS 13, Apple introduced Low Data Mode, an option users can enable on a per-network basis to nicely ask the system and developers to consume as little data as possible.
It’s then the developer’s responsibility to honor this preference. This option is easy to implement when using URLSession, but there is also a way to observe the status of the current network path, and the changes that may occur while your application or your screen is active, using the Network framework:
import Network

let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.isConstrained {
        // Current network is constrained by user preference, aka iOS 13+ Low Data Mode
    } else if path.isExpensive {
        // Current network is considered expensive (eg: cellular, hotspot)
    } else {
        // Current network has nothing special; most likely WiFi
    }
}
monitor.start(queue: DispatchQueue.global(qos: .background))
When you place more than 10 views in a SwiftUI container, the compiler refuses to compile and prompts the following error: Ambiguous reference to member 'buildBlock()'.
This is due to SwiftUI’s view building system internally having overloads to let you add 1 view, 2 views, 3 views… up to 10 views, but nothing for 11 and beyond.
The solution is to wrap up to 10 elements in a Group; it also lets you break up your component structure, for example:
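For instance, a hypothetical body with more than 10 children compiles once some of them are grouped:

```swift
import SwiftUI

struct NumbersView: View {
    var body: some View {
        VStack {
            Group {
                Text("1"); Text("2"); Text("3"); Text("4"); Text("5")
                Text("6"); Text("7"); Text("8"); Text("9"); Text("10")
            }
            // The Group above counts as a single child, leaving room for more views.
            Text("11")
        }
    }
}
```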