Speed was a company-wide initiative for eBay in 2019, with many teams determined to make the
site and apps as fast as possible for users. In fact, for every 100-millisecond improvement in
search page loading time, eBay saw a 0.5% increase in “Add to Cart” count.
Through the adoption of Performance Budgets (derived
after doing a competitive study with the Chrome User Experience
Report) and a focus on key
user-centric performance metrics, eBay was able to make
significant improvements to site speed.
Their Chrome User Experience Report data highlights these improvements, too.
There’s still more work ahead, but here are eBay’s learnings so far.
Web Performance “cuts”
The improvements eBay made were possible due to the reduction or “cuts” (in the size and time) of
various entities that take part in a user’s journey. This post covers topics that are relevant to
the web developer community at large, rather than eBay-specific topics.
Reduce payload across all text resources
One way to make sites fast is to simply load less code. eBay reduced their text payloads by trimming
HTML, and JSON responses served to users. Previously, with every new feature, eBay kept increasing
the payload of their responses, without cleaning up what was unused. This added up over time and
became a performance bottleneck. Teams usually procrastinated on this cleanup activity, but you’d
be surprised by how much eBay saved.
The “cut” here is the wasted bytes in the response payload.
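As a sketch of what this kind of trimming can look like — assuming a hypothetical response and an allow-list of the fields a page actually renders — the idea is simply to stop serializing what the client never reads:

```python
import json

# Hypothetical allow-list: only the fields the page actually renders.
RENDERED_FIELDS = {"id", "title", "price", "imageUrl"}

def trim_payload(items):
    """Drop keys the client never reads before serializing the response."""
    return [{k: v for k, v in item.items() if k in RENDERED_FIELDS}
            for item in items]

full = [{"id": 1, "title": "Lamp", "price": 12.5, "imageUrl": "/i/1.webp",
         "internalScore": 0.87, "debugTrace": "recall->rank->augment"}]
slim = trim_payload(full)
saved = len(json.dumps(full)) - len(json.dumps(slim))
print(f"bytes saved per item: {saved}")
```

Over thousands of items per response, and millions of responses, a few dozen wasted bytes per item adds up quickly.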
Critical path optimization for above-the-fold content
Not every pixel on the screen is equally important. The content above-the-fold is more
critical than something below-the-fold. Native and web apps
are aware of this, but what about services? eBay’s service architecture has a layer called
Experience Services, which the frontends (native apps and web servers) talk to.
This layer is specifically designed to be view- or device-based, rather than entity-based like item,
user, or order. eBay then introduced the concept of the critical path for Experience Services.
When a request comes to these services, they work on getting the data for above-the-fold
content immediately, by calling other upstream services in parallel. Once that data is ready, it is
sent to the client right away. The below-the-fold data is sent in a later chunk or lazy-loaded. The outcome: users get to see
above-the-fold content quicker.
The “cut” here is the time spent by services to display relevant content.
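A minimal sketch of the pattern, with hypothetical upstream calls and chunked flushing simulated in asyncio (the service names and delays are illustrative, not eBay’s actual upstreams):

```python
import asyncio

# Hypothetical upstream calls an Experience Service might fan out to.
async def fetch(name, delay):
    await asyncio.sleep(delay)
    return {name: "data"}

async def handle_request(send_chunk):
    # Above-the-fold data is fetched from upstream services in parallel...
    above = await asyncio.gather(fetch("title", 0.01), fetch("price", 0.01))
    send_chunk({"aboveTheFold": above})  # ...and flushed immediately.
    # Below-the-fold data follows in a later chunk (or is lazy-loaded).
    below = await fetch("recommendations", 0.05)
    send_chunk({"belowTheFold": below})

chunks = []
asyncio.run(handle_request(chunks.append))
print([list(c)[0] for c in chunks])  # → ['aboveTheFold', 'belowTheFold']
```

The key design choice is that the first chunk leaves the server as soon as the critical data is assembled, rather than waiting for the full response.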
Image optimizations
Images are one of the largest contributors to page weight, so even
small optimizations go a long way. eBay did two optimizations for images.
First, eBay standardized on the WebP image format for search
results across all platforms, including iOS, Android, and supported browsers. The search
results page is the most image-heavy page at eBay, and they were already using WebP, but not in a consistent manner across all platforms.
Second, though eBay’s listing images are heavily optimized (in both size and format), the same rigor
did not apply for curated images (for example, the top module on the
homepage). eBay has a lot of hand-curated images, which are uploaded
through various tools. Previously the optimizations were up to the uploader, but now eBay enforces
the rules within the tools, so all images uploaded will be optimized appropriately.
The “cut” here is the wasted image bytes sent to users.
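One common way to serve WebP safely on the web — a sketch of format negotiation, not necessarily eBay’s actual serving logic — is to check the client’s Accept header and fall back to JPEG for clients that do not advertise WebP support:

```python
def pick_image_format(accept_header):
    """Serve WebP when the client advertises support, else fall back to JPEG.
    (A sketch of content negotiation, not eBay's actual implementation.)"""
    return "webp" if "image/webp" in accept_header else "jpeg"

print(pick_image_format("image/webp,image/apng,*/*"))  # → webp
print(pick_image_format("image/jpeg,*/*"))             # → jpeg
```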
Predictive prefetch of static assets
A user session on eBay is not just one page. It is a flow. For example, the flow can be a navigation from the homepage to a search page to an item page. So why don’t pages in the flow help each other? That is the idea of predictive prefetch, where one page prefetches the static assets required for the next likely page.
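A minimal sketch of the idea — the page-to-assets mapping here is hypothetical — is to emit prefetch hints for the static assets of the next likely page in the flow:

```python
# Hypothetical mapping from the current page to the next likely page's assets.
NEXT_PAGE_ASSETS = {
    "home":   ["/static/search.js", "/static/search.css"],
    "search": ["/static/item.js", "/static/item.css"],
}

def prefetch_tags(current_page):
    """Emit <link rel=prefetch> hints so the browser downloads the next
    page's static assets while the user is still on the current one."""
    return "".join(f'<link rel="prefetch" href="{url}">'
                   for url in NEXT_PAGE_ASSETS.get(current_page, []))

print(prefetch_tags("search"))
```

When the user then navigates to an item page, its JS and CSS are already in the browser cache.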
Prefetching top search results
When a user searches eBay, eBay’s analytics data suggests that it is highly likely that the user
will navigate to an item in the top 10 of the search results. So eBay now prefetches the
items from search and keeps them ready for when the user navigates. The prefetching happens at two levels.
The first level happens server-side, where the item service caches the top 10 items in search results. When the user
goes to one of those items, eBay now saves server processing time. Server-side caching is leveraged by
native apps and is rolled out globally.
The other level happens in the browser cache, and is currently rolled out
in Australia. Item prefetch was an advanced optimization due to the dynamic nature of items. There
are also many nuances to it: page impressions, capacity, auction items, and so on. You can learn more
about it in LinkedIn’s Performance Engineering Meetup
presentation, or stay tuned for a detailed blog
post on the topic from eBay’s engineers.
The “cut” here can either be server processing time or network time,
depending on where the item is cached.
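A simplified sketch of such a server-side cache (the TTL, eviction behavior, and fetch interface here are assumptions for illustration, not eBay’s actual design):

```python
import time

CACHE_TTL = 30  # seconds; a hypothetical value, not eBay's actual TTL

class TopItemCache:
    """Server-side cache for the top items of a search results page."""
    def __init__(self):
        self._store = {}

    def put_top_results(self, item_ids, fetch):
        """Prefetch and cache the top 10 items right after a search."""
        now = time.monotonic()
        for item_id in item_ids[:10]:
            self._store[item_id] = (fetch(item_id), now)

    def get(self, item_id):
        entry = self._store.get(item_id)
        if entry and time.monotonic() - entry[1] < CACHE_TTL:
            return entry[0]   # hit: server processing time is saved
        return None           # miss: fall through to the item service

cache = TopItemCache()
cache.put_top_results(list(range(12)), lambda item_id: {"id": item_id})
```

The short TTL matters because items are dynamic (price changes, auctions ending), which is part of why this was an advanced optimization.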
Eager downloading of search images
On the search results page, at a high level, two things happen when a query is issued. The first is the recall/ranking step, where the most relevant items matching the query are returned. The second is augmenting the recalled items with additional user-context information, such as shipping costs.
eBay now immediately sends the first 10 item images to the browser in a chunk along with the header, so the downloads can start before the rest of the markup arrives. As a result, the images will now appear quicker. This change is rolled out globally for the web platform.
The “cut” here is the download start time for search result images.
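A sketch of the early-flush idea — the `header_chunk` helper and markup here are hypothetical — is to emit preload hints for the first 10 recalled images in the header chunk, before the augmented markup is ready:

```python
def header_chunk(recalled_items):
    """Build the first HTML chunk: the page header plus preload hints for
    the first 10 search-result images, flushed before augmentation finishes."""
    preloads = "".join(
        f'<link rel="preload" as="image" href="{item["img"]}">'
        for item in recalled_items[:10])
    return "<head>" + preloads + "</head>"

recalled = [{"img": f"/i/{n}.webp"} for n in range(12)]
chunk = header_chunk(recalled)
print(chunk[:80])
```

The browser can begin the image downloads as soon as this chunk arrives, in parallel with the server finishing the rest of the page.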
Edge caching for autosuggestion data
When users type letters into the search box, suggestions pop up. The suggestions for a given letter combination do not change for at least a day, which makes them ideal candidates to be cached and served from a CDN (for a maximum of 24 hours), instead of requests going all the way to a data center. International markets especially benefit from CDN caching.
There was a catch, though. eBay had some elements of personalization in the suggestions pop-up,
which can’t be cached efficiently. Fortunately, it was not an issue in the native apps, as the user
interface for personalization and suggestions could be separated. For the web, in international
markets, latency was more important than the small benefit of personalization. With that out of the
way, eBay now has autosuggestions served from a CDN cache globally for native apps and non-US
markets for eBay.com.
The “cut” here is the network latency and server processing time for autosuggestions.
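In HTTP terms, the split might look like this sketch, where only non-personalized suggestion responses get a 24-hour public cache lifetime (the exact header values are assumptions, not eBay’s actual configuration):

```python
def autosuggest_headers(personalized):
    """Allow CDN caching for 24 hours only when the response carries no
    personalization; personalized responses must not be shared."""
    if personalized:
        return {"Cache-Control": "private, no-store"}
    return {"Cache-Control": "public, max-age=86400"}  # 24 hours at the edge

print(autosuggest_headers(False))
```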
Edge caching for unrecognized homepage users
For the web platform, the homepage content for unrecognized users is the same for a particular region. These are users who are either using eBay for the first time or starting a fresh session, hence no personalization. Though the homepage creatives change frequently, there is still room for caching.
eBay decided to cache the unrecognized user content (HTML) on their edge network (PoPs) for a short period. First-time users can now get homepage content served from a server near them, instead of from a faraway data center. eBay is still experimenting with this in international markets, where it will have a bigger impact.
The “cut” here is again both network latency and server processing time for unrecognized users.
Optimizations for other platforms
Native app parsing improvements
Native apps (iOS and Android) talk to backend services whose response format is typically JSON. These JSON payloads can be large. Instead of parsing the whole JSON to render something on the screen, eBay introduced an efficient parsing algorithm that optimizes for content that needs to be displayed immediately.
Users can now see the content quicker. In addition, for the Android app, eBay starts initializing the search view controllers as soon as the user starts typing in the search box (iOS already had this optimization). Previously this happened only after users pressed the search button. Now users can get to their search results faster. The “cut” here is the time spent by devices to display relevant content.
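One way to approximate the parsing idea in a sketch — not eBay’s actual parser — is to arrange the payload so the critical (above-the-fold) object comes first and decode only that much, leaving the bulky remainder for later:

```python
import json

# Assumption for this sketch: the backend emits the critical object first,
# followed by the remainder, as two concatenated JSON documents.
payload = '{"title": "Vintage lamp", "price": 12.5}{"reviews": [], "related": []}'

decoder = json.JSONDecoder()
critical, end = decoder.raw_decode(payload)  # parse only the first document
rest = payload[end:]                         # parsed later, off the critical path
print(critical["title"])  # → Vintage lamp
```

The screen can render from `critical` immediately; `rest` is parsed after first paint.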
Native app startup time improvements
This applies to cold start time optimizations for native apps, in particular, Android. When an app is cold started, a lot of initialization happens both at the OS level and application level. Reducing the initialization time at the application level helps users see the home screen quicker. eBay did some profiling and noticed that not all initializations are required to display content and that some can be done lazily.
More importantly, eBay observed that there was a blocking third-party analytics call that delayed the rendering on the screen. Removing the blocking call and making it async further helped cold start times. The “cut” here is the unnecessary startup time for native apps.
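A sketch of the two fixes, with a stand-in analytics client (the class and method names are hypothetical): defer non-essential initialization, and move the analytics call off the startup path onto a background thread so it no longer blocks rendering:

```python
import threading

class Analytics:
    """Stand-in for a third-party analytics client with a slow network call."""
    def send_startup_event(self):
        pass  # imagine a blocking HTTP request here

def cold_start():
    # Only the work the first screen needs runs synchronously.
    home_screen = "rendered"
    # The analytics call runs in the background instead of blocking rendering.
    threading.Thread(target=Analytics().send_startup_event, daemon=True).start()
    return home_screen

print(cold_start())  # → rendered
```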
All the performance “cuts” eBay made collectively contributed towards moving the needle, and it happened over a period of time. The releases were phased in throughout the year, with each release shaving off tens of milliseconds, ultimately reaching the point where eBay is now.
Performance is a feature and a competitive advantage. Optimized experiences lead to higher user engagement, conversions, and ROI. In eBay’s case, these optimizations varied from things that were low-effort to a few that were advanced.
Check out Speed by a thousand cuts to learn more and be on the lookout for more detailed articles by eBay engineers on their performance work in the near future.