Running tests on streaming apps
Developing a video streaming service might seem like nothing: just whip up a player, pull some content from a server, and you’re done. But only once we got down to a project like that did we realize it was far from trivial to develop and test.
Hi! I’m Wladyslaw, and I’m a QA expert at Surf. In this article, I’ll tell you what to look out for when you’re testing a mobile app for an audio or video streaming service.
This text isn’t about implementing a video streaming service from a technical point of view; it’s about the product nuances. A couple of years ago this project became a challenge for us: none of us had ever come close to developing and testing a video streaming service before. Everything I’m writing about is purely my personal experience that I’d love to share with the community.
How do streaming services work?
To test video streaming services, you need to understand what they are and how they function.
Video streaming is the continuous transmission of video to a user’s device over the Internet in real time. Here’s how modern video streaming services work:
Obtaining videos
A widely popular streaming protocol is HLS. The technology was created by the folks at Apple: it’s a standard for their devices.
The app requests an m3u8 master playlist from the server; it lists the available quality variants. The device then picks the required quality from the list, and we receive fragmented files containing the video.
What master playlists might look like for different quality values
Inside these playlists are the chunks: fragments downloaded via HTTP that carry the video content. An audio track may be a separate entry in the master playlist, or it may be integrated into the fragments.
Example content of a 1080p media playlist file. Here you can see the .ts files containing the video. Each fragment here is 10 seconds long, but fragment lengths vary from service to service.
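Since the original screenshots aren’t reproduced here, below is a simplified, hand-written sketch of the two playlist types; every URL, bandwidth, and resolution value is made up for illustration. First, a master playlist with one entry per quality level:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/playlist.m3u8
```

And a media playlist such as 1080p/playlist.m3u8, listing 10-second .ts chunks:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
fragment000.ts
#EXTINF:10.0,
fragment001.ts
#EXT-X-ENDLIST
```

The device downloads the master playlist first, picks a variant that fits the current bandwidth, and then walks the chosen media playlist chunk by chunk.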
A Content Delivery Network (CDN) is another way to receive videos: through a direct link to a file on a storage server.
A CDN is a geo-distributed network infrastructure that optimizes the delivery and distribution of content to end users on the Internet.
This is a great choice for short videos such as ads. The only downside is that you have to wait until the video is fully loaded. The difference isn’t big, though: fragments in HLS are 10 seconds long, while ads run about 15. On the upside, you won’t stumble upon errors in the middle of playback.
What elements make up a video streaming app?
Imagine an app is like a cake that has the following layers:
Let’s now take a closer look at each layer.
Network interactions
Let’s start with initialization: you need to understand where the app gets video files from. There can be two options:
It might so happen that audio and video come in separate fields: the app then glues them together on the fly. That’s why you need to know how to handle errors when a video starts playing.
Processing launch errors
This point is crucial: it’s the moment when the user first interacts with the app. Here are the things you need to pay attention to:
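Whatever ends up on that checklist, the technical entry point is usually the player’s error callback. Here’s a minimal sketch with Media3 ExoPlayer on Android; the error messages and the showError hook are my own placeholders, not how any particular app does it:

```kotlin
import androidx.media3.common.PlaybackException
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer

// Sketch: catch launch and playback errors in one place and decide
// what the user sees: a retry button, a "check your connection" hint, etc.
fun attachErrorHandling(player: ExoPlayer, showError: (String) -> Unit) {
    player.addListener(object : Player.Listener {
        override fun onPlayerError(error: PlaybackException) {
            when (error.errorCode) {
                PlaybackException.ERROR_CODE_IO_NETWORK_CONNECTION_FAILED ->
                    showError("No connection. Retry?")
                PlaybackException.ERROR_CODE_PARSING_MANIFEST_MALFORMED ->
                    showError("This video is temporarily unavailable.")
                else -> showError("Playback error. Retry?")
            }
        }
    })
}
```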
Processing errors at playback
Automatically switching to a lower quality when the connection degrades, or starting playback in low quality on a weak network
The best option is to load a low-quality video first and then substitute the suitable version on the fly. You don’t want to make users wait too long for your content.
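With Media3 ExoPlayer on Android, one way to sketch this “start low, then lift the cap” behavior is through the track selector; the 360p cap and the helper names are my assumptions:

```kotlin
import android.content.Context
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.trackselection.DefaultTrackSelector

// Sketch: start playback capped at a low resolution so the first frame
// appears quickly, then remove the cap and let adaptive selection pick freely.
fun buildQuickStartPlayer(context: Context): Pair<ExoPlayer, DefaultTrackSelector> {
    val trackSelector = DefaultTrackSelector(context).apply {
        parameters = buildUponParameters()
            .setMaxVideoSize(640, 360) // cap the initial variant at ~360p
            .build()
    }
    val player = ExoPlayer.Builder(context)
        .setTrackSelector(trackSelector)
        .build()
    return player to trackSelector
}

// Call once the first frames are rendered.
fun liftQualityCap(trackSelector: DefaultTrackSelector) {
    trackSelector.parameters = trackSelector.buildUponParameters()
        .clearVideoSizeConstraints()
        .build()
}
```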
Switching quality manually
Subtitles
Though small, this feature is pretty important. You need to decide:
Player logic
Player logic is what users interact with the most. This block is more about product than technology. We believe that QA ensures product quality in every sense possible, so it’s crucial not only to make sure the features comply with the tech specs but also to make them user-friendly. For example, a button can function as planned, but tapping it may require some extra effort. That’s exactly what QAs have to look out for, since they have access to the project at a stage where you can still pinpoint and change such things.
Interruptions
Mobile apps can get interrupted by calls, push notifications, the notification shade, and transitions to other apps while video or audio keeps playing in the background. Obviously, users don’t want the person calling to hear background noises and find out what show they’re watching :)
It makes sense for the video to pause in such cases. Besides, I’d recommend paying attention to how the video resumes after the interruption: the player controls can explicitly show the pause state, or the video can just resume playing by itself.
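On Android, much of this pause-on-interruption behavior can be delegated to the player itself. A minimal sketch with Media3 ExoPlayer; the audio attribute values are a sensible default, not a requirement:

```kotlin
import android.content.Context
import androidx.media3.common.AudioAttributes
import androidx.media3.common.C
import androidx.media3.exoplayer.ExoPlayer

// Sketch: with handleAudioFocus = true the player pauses itself when
// another app or a phone call takes audio focus, and can resume after
// a short interruption ends.
fun buildInterruptionAwarePlayer(context: Context): ExoPlayer {
    val audioAttributes = AudioAttributes.Builder()
        .setUsage(C.USAGE_MEDIA)
        .setContentType(C.AUDIO_CONTENT_TYPE_MOVIE)
        .build()
    return ExoPlayer.Builder(context)
        .setAudioAttributes(audioAttributes, /* handleAudioFocus= */ true)
        .build()
}
```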
And we’ll be back after our regular rubric called “Curious bugs on iOS”
I came upon this bug as a user of streaming services.
As soon as the screen-time limit kicked in, the app shut down. After I pressed the “Later” button and temporarily switched off the limit, the video and audio froze, while the player button was in the “pause” position. To resume watching, I had to tap the button either twice (pause → resume) or four times, which is sort of annoying.
I won’t even tell you about the lag between the sound and the picture after an interruption :) The picture can also lag slightly behind the sound when you speed up the video, but that’s a whole other story. Having gained some experience casting videos to a TV, I can tell you it sometimes crashes and really annoys you, but that kind of thing occurs more often in browsers than in apps.
Screen rotation
Should have turned left, should have turned right, but I ended up wanting you to show me the screen the way I’ve put my phone, so I can feel all right.
Game time! Just imagine: you open a game that switches to landscape, or you tap the “full screen” button in the video player. Meanwhile, auto-rotate is off. Got it? Now the question: which side is your home button on?
Most of my colleagues and I said “on the right”. Which means that after you’ve opened the game or tapped “full screen”, you’ll automatically turn the device to the left.
We hadn’t specified that part in the tech specs, and it was done the other way around: the home button ended up on the left. As a result, we had to prove to the team that this was expected behavior, not the whim of a mean QA.
Of particular note is the case where auto-rotate is off. I’ve seen Android devices where players simply ignore it: they rotate in whichever direction the user turns the phone, despite rotation being locked. So state clearly in the specs what should happen in this case. It will save you from extra bug reports that one side considers rightfully filed and the other, based on experience with other apps, doesn’t.
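On Android, this often comes down to which orientation flag the full-screen player requests. The two constants below are real Android flags with exactly this documented difference; the helper around them is just a sketch:

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo

// Sketch: two ways to force landscape for a full-screen player.
fun enterFullScreenLandscape(activity: Activity, respectRotationLock: Boolean) {
    activity.requestedOrientation = if (respectRotationLock) {
        // Landscape; flips between the two landscape directions only
        // while the user's auto-rotate setting is on.
        ActivityInfo.SCREEN_ORIENTATION_USER_LANDSCAPE
    } else {
        // Landscape; follows the sensor in both directions even when
        // auto-rotate is off -- the "ignores the lock" behavior above.
        ActivityInfo.SCREEN_ORIENTATION_SENSOR_LANDSCAPE
    }
}
```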
Player controls
A player that’s easy to use is a player well made.
Showing and hiding controls. Focus on the following:
We didn’t think it through at first, and the controls always disappeared after two seconds, even when the video was paused. But if users pause something, they will want to resume it: they’d have to give the screen an extra tap just to see the controls again. That’s one step too many.
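Here’s a sketch of the rule we actually wanted; the class shape and callbacks are made up, only the behavior matters: hide the controls after a timeout during playback, never while paused.

```kotlin
import android.os.Handler
import android.os.Looper

// Sketch: auto-hide player controls after a delay, but keep them
// pinned on screen for as long as the video is paused.
class ControlsVisibilityManager(
    private val hideControls: () -> Unit,
    private val hideDelayMs: Long = 3_000,
) {
    private val handler = Handler(Looper.getMainLooper())
    private val hideRunnable = Runnable { hideControls() }

    var isPaused: Boolean = false
        set(value) {
            field = value
            if (value) handler.removeCallbacks(hideRunnable) // pinned while paused
            else scheduleHide()
        }

    // Call whenever the controls become visible (tap on the video, etc.).
    fun onControlsShown() {
        handler.removeCallbacks(hideRunnable)
        if (!isPaused) scheduleHide()
    }

    private fun scheduleHide() = handler.postDelayed(hideRunnable, hideDelayMs)
}
```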
Fast forward and rewind. Something similar happened with double-tap seeking and the player controls. We noticed that popular apps handle it like this: if you’ve just skipped to another fragment and the controls are still on the screen, each following tap works as another fast-forward or rewind.
A quick example: to fast forward 30 seconds in 10-second increments, you only need 4 taps, not 6. Two taps activate seeking and jump 10 seconds forward, and each of the following two taps adds another 10 seconds. It’s handy and doesn’t feel like the video is lagging while you skip ahead.
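A sketch of that accumulation logic; the names and the 700 ms window are my assumptions, and real apps usually tie this to the controls-visible state too:

```kotlin
// Sketch: the first double tap activates seeking; while seeking stays
// active, every further tap inside the window adds another increment.
class DoubleTapSeeker(
    private val seekBy: (seconds: Int) -> Unit,
    private val stepSeconds: Int = 10,
    private val windowMs: Long = 700,
) {
    private var lastTapAtMs = Long.MIN_VALUE / 2 // "long ago"
    private var seekingActive = false

    fun onTap(nowMs: Long) {
        val withinWindow = nowMs - lastTapAtMs <= windowMs
        lastTapAtMs = nowMs
        when {
            seekingActive && withinWindow -> seekBy(stepSeconds) // one tap = +10 s
            withinWindow -> { // second tap of the activating double tap
                seekingActive = true
                seekBy(stepSeconds)
            }
            else -> seekingActive = false // lone tap: just toggle the controls
        }
    }
}
```

With this rule, 30 seconds is exactly four taps: the activating double tap plus two more singles.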
Controls and an open notification shade:
Player gestures
And of course, let’s not forget about player gestures: their scope and location. They can hide the frame, adjust brightness or volume, or be specific gestures that trigger player features. What matters is that they don’t conflict with each other and correctly adjust their hit areas to screens of different dimensions (hello there, iPhone SE).
Saving settings for the next video
Is the state of your settings saved and applied to the next video? By “state” I mean:
If the settings are specified explicitly (in the app settings), they have to apply everywhere.
If they’re only set once in the player, they have to apply to the following videos within the same session.
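A sketch of that split on Android; the preference keys, the subtitles flag, and the speed field are illustrative:

```kotlin
import android.content.Context

// Sketch: explicit app settings persist across sessions; one-off player
// tweaks live only in memory for the current session.
class PlaybackSettings(context: Context) {
    private val prefs = context.getSharedPreferences("player", Context.MODE_PRIVATE)

    // Set in the app settings screen: applies everywhere, survives restarts.
    var defaultSubtitlesOn: Boolean
        get() = prefs.getBoolean("subtitles_on", false)
        set(value) { prefs.edit().putBoolean("subtitles_on", value).apply() }

    // Set from the player UI: carried over to the next video, but reset
    // when the app is relaunched.
    var sessionPlaybackSpeed: Float = 1.0f
}
```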
Player buttons
Logic. There should be no unnecessary interruptions: seamless is best.
Tap area. Tap areas shouldn’t overlap or be too small. That’s particularly important for buttons near the player timeline, since you might go to the end or beginning of the video by accident.
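On Android, one common way to grow a button’s hit area without changing the layout is a TouchDelegate; here’s a sketch (the helper name and padding value are mine):

```kotlin
import android.graphics.Rect
import android.view.TouchDelegate
import android.view.View

// Sketch: expand a small button's tap area without changing its visuals.
// Note: a parent view holds a single TouchDelegate, so several buttons
// would need a composite delegate.
fun expandTapArea(button: View, extraPx: Int) {
    val parent = button.parent as View
    parent.post {
        val hitRect = Rect()
        button.getHitRect(hitRect)        // button bounds in parent coordinates
        hitRect.inset(-extraPx, -extraPx) // negative inset grows the rect
        parent.touchDelegate = TouchDelegate(hitRect, button)
    }
}
```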
Subtitles
The subtitles should:
Picture in picture and playback in the background (pip-a-back)
And we’re on to our special and most painful rubric: picture in picture and playback in the background. I’m saying “painful” because we built the streaming on Flutter: native plugins, some extra logic in the native languages, and in some places it’s a little crooked. Besides, it’s pretty hard to test: there are only a few real-life examples to compare against.
Why “pip-a-back”: sidenote
In our team, we’ve been calling the feature that combines PIP and playback in the background “pip-a-back”. We first implemented PIP, which stands for “picture in picture”, then added playback in the background, and then went on to make the two available simultaneously.
There aren’t many requirements for PIP and background playback: virtually nothing was customized. Here’s what you have to look out for when you’re testing the feature.
Control logic in these modes: what can the controls be and look like? And when are they completely unavailable?
Queues: one piece of content should smoothly and seamlessly transition into another, with no long pauses or delays.
If you’re going to integrate ads that can’t be skipped into videos, and they’re going to be available in pip-a-back, make sure that:
The same goes for when the video comes to an end. You need to let the users know what is happening on their screens so that they’re not shocked by the sudden changes.
Playback speed, resolution, and subtitles need to work the same way as they do in the “big” player.
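For reference, on Android entering PIP is a single platform call (available since API 26); the 16:9 aspect ratio below is an assumption about the content:

```kotlin
import android.app.Activity
import android.app.PictureInPictureParams
import android.os.Build
import android.util.Rational

// Sketch: switch the current activity into picture-in-picture mode.
fun enterPip(activity: Activity) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        val params = PictureInPictureParams.Builder()
            .setAspectRatio(Rational(16, 9)) // keep the video's shape
            .build()
        activity.enterPictureInPictureMode(params)
    }
}
```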
Author’s advice
If you make the pip-a-back mode available, make sure to check it on all the available OS versions and manufacturer skins. A surprising number of pitfalls and exotic bugs depend on the skin.
As for the interactions between PIP and playback in the background, that is some fine art. Sometimes, after one of them is activated, the other one stops working and can only be fixed by restarting the app.
Player usability
The little things that let users forget that between them and the content there’s actually a middle layer: the app.
Earphones. Earphones can do all sorts of sensor, gesture, and voice control extravaganza, and can even pause the content when you take one of them out of your ear. They can even read your messages to you while you’re listening to something.
Now, imagine this: someone is on a bus listening to a podcast. They take one earphone out to check what the next stop is, and the content starts blasting for all the passengers to hear. That would be awkward. I’m guessing many people (me included) dread this kind of thing.
The same goes for connecting and disconnecting the earphones: what happens to the content? Does the app pause it or keep playing? And, of course, the buttons on the earphones have to react to the regular gestures and control the app accordingly.
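The unplug/disconnect half of this is easy to cover on Android: the system announces that audio is about to “become noisy”, and Media3 ExoPlayer can react to it for you (in-ear detection, by contrast, is handled by the earphones themselves). A sketch:

```kotlin
import android.content.Context
import androidx.media3.exoplayer.ExoPlayer

// Sketch: pause automatically when sound would suddenly jump to the
// loudspeaker (wired headphones unplugged, Bluetooth ones disconnected).
fun buildNoisyAwarePlayer(context: Context): ExoPlayer =
    ExoPlayer.Builder(context)
        .setHandleAudioBecomingNoisy(true)
        .build()
```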
Fast forwarding. Don’t like the ads integrated into a video, or just want to skip to the chorus? Fast forward is your best friend, but only if it doesn’t skip huge chunks. It’s not really cool to skip a whole minute when you just want to go a couple of seconds forward, right? It’s good when you can set the intervals in the app. Bounce it off your colleagues and the client, do a study, and show proof that the current behavior isn’t usable. You’ll be thanked later.
Location of the player buttons and the roles they play. Say the settings button is in the upper left corner. It’s not in the right spot, is it? If you’re holding a 6.6-inch phone in your right hand and trying to reach the settings button, good luck with that. Even the strongest and bravest have failed.
Streaming on TV. At first, I didn’t care for casting from the app to a TV. But then I gave it a try, and I’ll never be the same.
Watching videos on a big screen with your favorite snacks while lying on your couch is one of the greatest pleasures there is. Don’t take it away from your users; raise it with the team right from the start. AirPlay on iOS or any casting option on Android saves you from negative user feedback with headlines like “Who even watches content on a phone in 2022?”
It’s important, though, not only to say that you can cast to a TV but also to check it on several TVs. Sure, you’ll have to lean on your colleagues for help, but it’s worth it.
In video streaming apps, testing the code isn’t hard, but there’s a lot of product work
Technology-wise, QAs are expected to do the same thing as always: same old component tests for each player element, plus some specifics of integration with libraries or features.
However, there’ll be plenty of product work and headaches with logic: I’m talking about small things like “checking how the controls behave in some super unobvious cases” or “checking if the buttons are easy to use when you’re watching a video while lying on a couch”.
These things are hugely important, though. After all, we’re not looking at the project from a does-it-comply-with-the-tech-specs angle; we’re checking whether it’s a good enough finished product. If you don’t pay attention to these details, the product will be a win in terms of not having bugs, but a failure in terms of user experience.
So, my friends, do the things you usually do with the code. But as far as all things logic go, team up with product managers and analysts and put in some elbow grease until you get something cool and pretty.