At Unity, we continually work to ensure our ironSource SDK is best-in-class. But at our scale - reaching more than 1 billion devices - keeping an SDK at the top takes a village.
Quality is a two-part process - we need to (1) build a strong product for our developers, and that product needs to (2) ensure a good experience for the users of our developers’ apps. And as we work to keep our developers and their users happy, the market continually pushes us toward new and updated features.
In this industry, there’s always a delicate balance between progression (e.g., building and testing new features) and regression (e.g., confirming existing software still works well). Essentially, we can’t keep manually developing new features without supporting the features - and clients - we already have.
Instead of needing to increase our resources, we can balance progression and regression with automation. Normally, you would have to manage a tradeoff between quality and quantity. But with automation, we don’t have to sacrifice one for the other - we can optimize toward progression and regression at the same time.
So, how do we manage to scale while meeting and exceeding market demand and carrying the weight of hundreds of millions of daily active users? By ensuring every layer of our SDK delivery process is automated.
Here’s how we use automation to ensure every one of our SDK integrations runs smoothly - from our developers and quality engineers, to our integration team.
Layer 1: Developers
Let’s start with developers because they’re the ones writing the code. From the moment our Unity LevelPlay developers start coding, we always have quality in mind - every part of our SDK is backed by unit tests. Unit tests are basic, automated tests that ensure each SDK component works as intended.
Unit tests are constantly running, automatically checking that when the API (application programming interface) is triggered with a given input, the output is as expected. Essentially, the tests confirm that the API is communicating properly. For example, if we want to integrate a new ad format into our new SDK version, the unit test would confirm that each ad format presents and behaves as expected.
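To make that concrete, here is a minimal sketch of what such a unit check might look like. The `AdUnit` class and its method names are hypothetical stand-ins, not the actual ironSource SDK API - the point is only the pattern: trigger the API with an input, assert the output.

```java
// Hypothetical sketch of a unit test target for an ad-format API.
// AdUnit, load(), and isReadyToShow() are illustrative names only,
// not the real SDK surface.
public class AdUnit {
    private static final java.util.Set<String> SUPPORTED =
        java.util.Set.of("banner", "interstitial", "rewarded");

    private final String format;
    private boolean loaded;

    public AdUnit(String format) { this.format = format; }

    // Simulated load call: succeeds only for formats the SDK supports.
    public boolean load() {
        loaded = SUPPORTED.contains(format);
        return loaded;
    }

    public boolean isReadyToShow() { return loaded; }

    public static void main(String[] args) {
        // Given a supported format, load() should succeed and the
        // unit should report it is ready to present.
        AdUnit interstitial = new AdUnit("interstitial");
        System.out.println("loaded: " + interstitial.load());
        System.out.println("ready:  " + interstitial.isReadyToShow());

        // Given an unknown format, the API should fail cleanly, not crash.
        AdUnit unknown = new AdUnit("hologram");
        System.out.println("unknown loads: " + unknown.load());
    }
}
```

A real test suite would run dozens of these checks per component on every commit; the value is that they are cheap, deterministic, and run without human intervention.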
In fact, we’re not just ensuring that the API information is being communicated - we’re confirming that the information is consistently accurate. For example, within Unity LevelPlay mediation, there are different ad network requests popping up at all times. As the bidding system triggers an auction for the top ad network, it’s critical that every piece of data in this moving puzzle is up-to-date and accurate. If one piece is not accurate, it can affect the entire funnel, so our automated tests ensure we can keep a close eye on all of the moving pieces.
This can be a very tedious task - highlighting why automation is so important. One of the biggest roles of automation infrastructure, particularly unit tests, is testing data all the time. Unit tests certainly cover regression, but not necessarily progression - so that’s where the next level of testing, integration testing, comes in.
Layer 2: Quality Engineers
The next layer of progression is covered by the quality engineers, who specialize in a variety of testing - particularly integration tests. Integration tests automatically run on a daily basis, but instead of checking whether individual SDK components work, integration tests check how those components work together. Continuing with our previous example, if we want to add a new ad format to our newest SDK version, quality engineers would set up an integration test, systematically checking how this ad format might interact with other ad formats.
Even for automated tests, this can be a tedious process - with so many SDK components, there are an endless number of ways they could combine and interact. That’s why our SDK quality engineers tend to use the 80/20 rule, testing the top 20% most common interactions to account for 80% of the combination scenarios. The larger the ground to cover, the higher the possibility of technical issues, so our quality engineers are encouraged to be hyper-skeptical - to assume there’s a technical issue until testing proves otherwise.
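The 80/20 idea can be sketched as follows. Everything here is hypothetical - the `Session` class, the rule that two full-screen ads can’t be shown together, and the list of "common pairs" are assumptions for illustration - but it shows the shape of a pairwise integration check: instead of enumerating every combination, exercise only the pairs that dominate real traffic.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical pairwise integration check. The Session class and its
// "one full-screen ad at a time" rule are illustrative, not real
// ironSource behavior.
public class IntegrationCheck {
    static class Session {
        static final Set<String> FULL_SCREEN =
            new HashSet<>(Arrays.asList("interstitial", "rewarded"));
        private final Set<String> active = new HashSet<>();

        // Showing a second full-screen ad while one is active must fail.
        boolean show(String format) {
            if (FULL_SCREEN.contains(format)) {
                for (String a : active) {
                    if (FULL_SCREEN.contains(a)) return false;
                }
            }
            return active.add(format);
        }
    }

    // 80/20 rule: an assumed list of the most common format pairs
    // seen in production, rather than every possible combination.
    static final String[][] COMMON_PAIRS = {
        {"banner", "interstitial"},
        {"banner", "rewarded"},
        {"interstitial", "rewarded"},
    };

    // Returns true if every common pair behaves as specified: a banner
    // coexists with a full-screen ad, but two full-screen ads conflict.
    public static boolean runCommonPairs() {
        for (String[] pair : COMMON_PAIRS) {
            Session s = new Session();
            if (!s.show(pair[0])) return false;
            boolean bothFullScreen =
                Session.FULL_SCREEN.contains(pair[0])
                && Session.FULL_SCREEN.contains(pair[1]);
            // The second show() must fail exactly when both are full-screen.
            if (s.show(pair[1]) == bothFullScreen) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println("common pairs pass: " + runCommonPairs());
    }
}
```

The design choice worth noting is that the test encodes an expectation per pair, so a regression in how two components interact fails loudly even though only a small slice of the combination space is exercised.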
Layer 3: Integration team
Let’s say the developers and quality engineers have already given this new SDK version the green light, including the new ad format. Even though this new version is ready for launch, it might not be able to thrive in every scenario - for example, it might not work well in a few countries that only have 3G.
Before we release this new SDK update to developers, our integration team uses alpha apps, internal production apps made with in-house tech, to measure how the update will perform in a real scenario - both from the developer’s and user’s perspective. In our example, the integration team would test the SDK on real traffic in these countries, using the same tools that developers use, just in a closed environment.
Both integrating the SDK into an alpha app and uploading the app into the store are fully automated processes. In many ways, the integration team works like a production line, with many automated steps along the way.
Once the new SDK version is live, we can continually make adjustments to ensure the best quality user experience. As we develop many of our new SDK features, we include a toggle option - so if a feature ever becomes faulty for a certain audience, we can simply turn it off.
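A toggle like this is typically driven by remotely fetched configuration, so a faulty feature can be switched off without shipping a new SDK version. The sketch below is an assumption-heavy illustration - the `FeatureFlags` class and the flag name are hypothetical, not the actual SDK config - but it captures the key property: unknown flags default to off, so a new feature stays dormant until it is explicitly enabled.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical feature-toggle sketch. In production the flag map would
// be fetched from a remote config service at SDK init; here it is
// seeded locally for illustration.
public class FeatureFlags {
    private final Map<String, Boolean> flags = new HashMap<>();

    public FeatureFlags(Map<String, Boolean> remoteConfig) {
        flags.putAll(remoteConfig);
    }

    // Unknown flags default to off, so a feature ships dark and is
    // only activated when the server-side config enables it.
    public boolean isEnabled(String feature) {
        return flags.getOrDefault(feature, false);
    }

    public static void main(String[] args) {
        Map<String, Boolean> remote = new HashMap<>();
        remote.put("new_ad_format", true);  // illustrative flag name

        FeatureFlags config = new FeatureFlags(remote);
        System.out.println("new_ad_format on: "
            + config.isEnabled("new_ad_format"));

        // If the feature misbehaves for some audience, flipping the
        // server-side flag turns it off without an SDK update.
        remote.put("new_ad_format", false);
        config = new FeatureFlags(remote);
        System.out.println("after kill switch: "
            + config.isEnabled("new_ad_format"));
    }
}
```

The default-off behavior is the important design choice: it makes the remote kill switch safe even when a client never receives the updated config.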
Combined, these three layers of testing ensure that we have a fortified automation infrastructure - ensuring that our products are high quality for every one of our many users. The automation process grants us the biggest gift possible - time - which we can use to focus on progression, and developing new and innovative features to keep our SDK best-in-class.