As any native app developer will tell you, API responsiveness and application performance are directly correlated with a positive user experience — meaning, when those things are running smoothly, your fans will be happy, and when they’re running less well, they notice. Because Vimeo’s app is so dependent on network requests, we investigated ways in which we could improve load time. While there were many areas in the request lifecycle that we examined, we focused on the parsing of JSON responses.
Gson is ON
The Vimeo Android app uses Retrofit for its networking layer and Gson for deserialization. One downside to this approach is that it can be quite slow, because Gson uses reflection to turn JSON into model objects. So to improve deserialization time, we wanted to try removing that reflection. And we’re not the only ones who have realized how important this is: across industries, people are starting to recognize the cost of slow reflection.
To avoid reflection, we created custom Gson TypeAdapters. These allow us to control exactly how data is parsed and provide us with a faster alternative. We have many models in our networking layer, and we chose a few of them to quantify the effect of reflection-less (de)serialization. The table below shows these models in terms of their data size.
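To make the approach concrete, here is a minimal sketch of a hand-written TypeAdapter for a hypothetical `Video` model (the class and field names are illustrative, not Vimeo’s actual models). Because the adapter names each field explicitly, Gson never has to reflect over the class:

```java
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.TypeAdapter;
import com.google.gson.stream.JsonReader;
import com.google.gson.stream.JsonWriter;
import java.io.IOException;

public class VideoAdapterDemo {

    // Hypothetical model standing in for one of the networking models.
    static class Video {
        String uri;
        String name;
    }

    // Hand-written adapter: field names are matched explicitly in a
    // switch, so no reflection is needed at (de)serialization time.
    static class VideoTypeAdapter extends TypeAdapter<Video> {
        @Override
        public void write(JsonWriter out, Video value) throws IOException {
            out.beginObject();
            out.name("uri").value(value.uri);
            out.name("name").value(value.name);
            out.endObject();
        }

        @Override
        public Video read(JsonReader in) throws IOException {
            Video video = new Video();
            in.beginObject();
            while (in.hasNext()) {
                switch (in.nextName()) {
                    case "uri":  video.uri = in.nextString();  break;
                    case "name": video.name = in.nextString(); break;
                    default:     in.skipValue();               break; // ignore unknown keys
                }
            }
            in.endObject();
            return video;
        }
    }

    // Registering the adapter tells Gson to bypass its reflective path
    // for Video.
    static final Gson GSON = new GsonBuilder()
            .registerTypeAdapter(Video.class, new VideoTypeAdapter())
            .create();

    public static void main(String[] args) {
        Video video = GSON.fromJson(
                "{\"uri\":\"/videos/1\",\"name\":\"Demo\"}", Video.class);
        System.out.println(video.name);
    }
}
```

Multiply this by every field of every nested model and you can see both the speed win and the boilerplate problem discussed below.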
| Model size (KB) | Model Type | Device | Method | Reflection (ms) | Custom TypeAdapter (ms) | Performance Gain |
| --- | --- | --- | --- | --- | --- | --- |

*All times are averages over 3 runs on the main thread.*
We used a high-end tablet for testing, knowing that if we saw gains there, lower-end devices would benefit as well. Looking at the table above, we can see that in many cases, primarily when the data was not as large, using custom TypeAdapters was faster than using reflection. The outlier was one of our heaviest models: on a high-end device such as the Nexus 9, reflection was faster than a custom TypeAdapter. But profiling on a lower-end device showed that we could still cut down on parsing time, and since not everyone has a top-shelf device, we decided it was still in our members’ best interests to use custom TypeAdapters.
The models in the table above contained nested objects, and while the adapter code was pure boilerplate, it amounted to 3,000 lines of additional code! We weren’t thrilled with the idea of writing all of that by hand, so one of our engineers had a great idea: why not generate it at compile time? Enter STAG. STAG stands for Speedy Type Adapter Generation, and it does just that.
STAG is an annotation processor. It works by looking for a specific annotation (GsonAdapterKey) on class member variables that you want to (de)serialize using Gson. If it finds this annotation on a member variable, it will create a TypeAdapter for that class, generating code for that member variable and any other annotated fields.
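In practice, opting a model in looks something like the sketch below. The `Video` class and its fields are hypothetical, and the generated factory registration reflects the library’s README at the time of writing (check your generated sources for the exact class name in your version):

```java
import com.vimeo.stag.GsonAdapterKey;

// Annotating a field opts the class in: STAG generates a TypeAdapter
// for Video at compile time, covering every annotated field.
public class Video {

    @GsonAdapterKey("uri")
    public String uri;

    @GsonAdapterKey("name")
    public String name;

    // Fields without the annotation are left out of the generated adapter.
    public long localCacheTimestamp;
}

// Elsewhere, register the generated factory once:
Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new Stag.Factory())
        .create();
```

Note that annotated fields need to be accessible to the generated code (i.e. not private), since the adapter assigns them directly rather than reflectively.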
In our networking layer, we have abstract classes that use generics, so we made sure to accommodate them. When you add the annotation to a concrete subclass, STAG will create a TypeAdapter for that class that incorporates any annotated members of its parent — even those that are generic.
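For instance, a generic base class and a concrete subclass might look like this (both classes and all field names here are hypothetical, for illustration only):

```java
import com.vimeo.stag.GsonAdapterKey;

// Hypothetical generic base class in a networking layer.
public abstract class BaseResponse<T> {

    @GsonAdapterKey("total")
    public int mTotal;

    @GsonAdapterKey("data")
    public T mData;
}

// Annotating the concrete subclass triggers generation: STAG creates a
// TypeAdapter for VideoList that also handles the inherited annotated
// fields, with T resolved to Video.
public class VideoList extends BaseResponse<Video> {

    @GsonAdapterKey("page")
    public int mPage;
}
```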
If you already use Gson’s SerializedName annotation, taking advantage of this library is as simple as replacing that annotation with GsonAdapterKey. We were using SerializedName on many of our models, so incorporating STAG was fairly easy. If you’re not using SerializedName, simply add the GsonAdapterKey annotation to the appropriate models.
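The swap is a one-line change per field. In this sketch, the `User` class and its JSON key are made up for illustration:

```java
// Before: Gson matches "display_name" reflectively at runtime.
public class User {
    @SerializedName("display_name")
    public String displayName;
}

// After: same JSON key on the wire, but the field is now parsed by a
// TypeAdapter that STAG generates at compile time.
public class User {
    @GsonAdapterKey("display_name")
    public String displayName;
}
```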
Ready, set, deserialize
Want to get started and improve overall app performance and user sentiment? STAG is open source ( https://github.com/vimeo/stag-java ) and available now. No one should have to go through the headache of writing custom TypeAdapters by hand, so if you use Gson, you can drop in this library and start reaping the benefits of performant, reflection-less (de)serialization.