Accelerating Ahead: Mastering Performance Benchmarking for Competitive Edge: On-Demand Webinar

Watch our on-demand webinar titled "Accelerating Ahead: Mastering Performance Benchmarking for Competitive Edge." Explore essential strategies for optimizing application speed to outpace competitors. Our session delves deep into performance benchmarking intricacies, providing actionable solutions and showcasing their pivotal role in gaining competitive advantage in today's dynamic market.

To learn more about HeadSpin, visit: https://www.headspin.io/
Transcript
00:00So, yeah, today we'll be talking about mastering performance benchmarking to gain a competitive
00:08edge over your competitors.
00:14So, introducing myself, I'm Alif. I've been working at HeadSpin as the solutions and offerings
00:20manager for three-plus years.
00:25I've been helping customers with various solutions, one of them being a benchmarking solution.
00:32So the main agenda behind that solution is to identify the key KPIs and user flows for
00:40various customers, build a comparative view of their performance versus their
00:47competitors' performance, help them identify the areas where they are lagging
00:53behind, and provide insights on how to improve these KPIs and their overall performance.
01:04So the whole setup works like a project to improve the end-user experience of the overall
01:12application.
01:14So here, we'll be starting with the story of a video-on-demand platform, Shudder.
01:24And then we'll be talking about how Shudder as an application can expand to various markets
01:33by delivering an impactful user experience.
01:37Then we'll talk about how Headspin does it, what Headspin offers, and how we provide
01:46a benchmarking solution across different industries, and then the steps involved in benchmarking.
01:55Then we'll follow with some success stories and some sample dashboards that we have captured
02:03across some of our projects.
02:08So yeah, starting with the short story of the Shudder application: Shudder is
02:14an American OTT platform focused on video on demand, featuring horror,
02:24thriller, and supernatural fiction titles.
02:28This is an entity owned by AMC Networks.
02:31They began their service only in the United States, around 2015.
02:38And later they expanded to Canada, UK, Ireland, and New Zealand.
02:43They currently have 10 million plus active subscribers, with an annual revenue of almost
02:5118.5 million.
02:55So currently, if you look at this industry, there are a lot of competing platforms, right?
03:02When you think about Netflix, Hotstar, or even Amazon Prime, these are the major players
03:11that are booming in this industry.
03:13They are very influential, the main reason being that they have a presence across many
03:20locations around the globe.
03:23So when thinking about the genre of horror that Shudder is looking into, this specific
03:33genre has a very good opportunity in locations like Japan, Korea, India, Turkey, et cetera,
03:44where Netflix, Amazon Prime, Disney, et cetera are ruling with their reach.
03:50So there is a very strong potential growth opportunity for Shudder across these
03:56locations.
03:58Now looking at the competitors, Netflix, Hotstar, and Prime are the largest and
04:04the most used platforms across these locations globally.
04:09Their exceptional user experience is one reason.
04:15And they also serve many other genres.
04:23Then, for a platform like Shudder to grow and scale across these territories,
04:31they have to provide two basic things.
04:38One, from a functional perspective, it should ideally offer all the necessary features that
04:44are offered by the competition.
04:46For example, some of the key features that you might think are very simple but
04:54are very effective: Netflix has user profiles, relevant and personalized
05:03content recommendations, easy search to find the specific content the customer is looking
05:11for, a basic function like skip intro whenever we open a series or web-based content,
05:21and watch parties that let multiple users watch a specific video at the same time.
05:30So basically these are the functional aspects.
05:34And when talking about it from an experience point of view, it should ideally offer a faster
05:39experience to the customers,
05:41in terms of page load times, and if it's a video, the video streaming should be optimal.
05:47It should be consistent across devices.
05:50Then there should be minimal loading animations, et cetera.
05:56And it should also perform well across different network conditions.
06:01So these are the basic factors that set the initial impression on the users, right?
06:08So quantifying these metrics and continuously monitoring them across various devices
06:15and locations is a great challenge for any such company that has a mobile application.
06:26So when looking at the big challenges, or the steps that have to be followed in order to provide
06:35a very good user experience, the first is to choose the right kind of metrics.
06:42It is always a struggle to identify the right metrics that can actually be measured.
06:50There is a misconception among companies that metrics like page views or maybe downloads
06:59can be used, but these do not correlate with customer satisfaction.
07:07Some of the metrics that correlate with customer satisfaction are, say, completion
07:15rate, page load times, and video quality.
07:23These are the metrics that really matter to the customers, and they reflect
07:32the customers' perception of the specific user experience that you are providing.
07:38Then it's necessary for a company to build a holistic view of the user experience, with
07:46both qualitative and quantitative data.
07:51Often what we see is that the qualitative data comprises
08:05the perceived user experience.
08:07For example, a screen might have a lot of blank spaces.
08:13It is very difficult to quantify these kinds of issues.
08:21Then there are issues like not all the UI elements getting loaded, or a specific
08:31element on a screen, be it an image or an animation, taking a lot of time to load.
08:39So that is basically what qualitative data is.
08:44Quantitative data would cover the memory used, CPU usage, and battery usage over a specific
08:49time window.
08:50So these are the metrics that can actually be measured.
08:55They can be measured directly from the mobile device, or whatever specific device the app is
09:00being tested on.
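As a rough illustration of how such quantitative metrics can be pulled straight from a device, here is a minimal sketch that samples memory and CPU for one app over adb. The package name is hypothetical, and this only shows the general idea; it is not HeadSpin's own capture pipeline.

```python
# Minimal sketch: sample CPU and memory for one app over adb.
# Assumes an Android device is connected and `adb` is on PATH; the package
# name below is hypothetical.
import subprocess
import time

PACKAGE = "com.example.videoapp"  # hypothetical app under test

def sample_meminfo(package: str) -> str:
    """Return the raw dumpsys meminfo output for one app."""
    return subprocess.run(
        ["adb", "shell", "dumpsys", "meminfo", package],
        capture_output=True, text=True, check=True,
    ).stdout

def sample_cpuinfo() -> str:
    """Return the device-wide dumpsys cpuinfo snapshot."""
    return subprocess.run(
        ["adb", "shell", "dumpsys", "cpuinfo"],
        capture_output=True, text=True, check=True,
    ).stdout

samples = []
for _ in range(12):  # roughly a one-minute window at 5-second intervals
    samples.append({
        "ts": time.time(),
        "meminfo": sample_meminfo(PACKAGE),
        "cpuinfo": sample_cpuinfo(),
    })
    time.sleep(5)

print(f"Collected {len(samples)} samples for {PACKAGE}")
```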
09:01Then comes the integration of this user experience data into a centralized system, and viewing all
09:08of this data in a dashboard that is easy for anybody to consume, so that you can
09:17make sense of the data and do a root cause analysis accordingly.
09:23So that is the basic path of action that needs to be followed.
09:32Then, once all of these metrics are captured, they need to be benchmarked against various
09:38competitors.
09:42Say that Shudder has done all of this.
09:47They have identified the right set of metrics.
09:49They have created a holistic view of their user experience.
09:52Even with that, if they enter a new market, there is a good chance that they are lagging behind
10:01some of their key competitors.
10:03Nowadays, when we talk about a specific territory, apart from the large players that are in the
10:10picture, there would be local players as well that are specific to each of these
10:15markets.
10:16So it is very necessary to benchmark your performance against the performance of competing
10:24applications.
10:26Talking about Shudder, if they want to scale to India, they would have to compare their
10:32performance against the services that are present here.
10:35That would include Netflix, Prime, Disney, SonyLIV, ZEE5, and similar platforms.
10:43Once all of these metrics are compared, Shudder would be able to benchmark
10:55their performance against the competing applications, do a root
11:04cause analysis of where they are lagging, and Headspin itself can provide insights on how
11:12this can be improved.
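As a simple illustration of what such a comparative view boils down to, here is a sketch that tabulates a few KPIs for a target app against its peers and computes the gap to the best peer. The apps are from the example above, but the numbers are made up for illustration.

```python
# Illustrative only: once per-app KPI medians are measured, a comparison table
# like this makes the gaps visible. The values below are made up.
import pandas as pd

kpis = pd.DataFrame(
    {
        "app": ["Shudder", "Netflix", "Prime Video", "Disney+ Hotstar"],
        "home_load_s": [4.8, 2.1, 2.6, 2.9],        # homepage launch time
        "video_start_s": [3.5, 1.4, 1.9, 2.2],      # time to first frame
        "search_response_s": [2.2, 0.9, 1.1, 1.3],  # search response time
    }
).set_index("app")

# Gap of the target app versus the best peer on every KPI.
gap = kpis.loc["Shudder"] - kpis.drop("Shudder").min()
print(kpis)
print("\nGap vs. best peer (seconds):\n", gap)
```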
11:14Now, talking about continuous monitoring of these applications: once all of these metrics
11:21are captured, it is very necessary for a company to collect this data on a regular basis.
11:31If the data is captured once and then left behind, there is no real use for it.
11:37So it's always necessary to implement continuous user experience monitoring with regular check-ins
11:43and feedback loops, analyze the trend of where you stand compared with the
11:51competitors, and then update or modify the app accordingly.
11:57Then, obviously, there are a lot of business benefits to doing such
12:05activities.
12:06One is obviously lower customer churn.
12:12There would be higher market share for such applications.
12:15If you think about Netflix, they offer one of the best user experiences
12:22when compared to all of their competitors.
12:25If you look at the page load times, everything about the user experience on Netflix
12:31is very seamless.
12:34So yeah, that basically adds to the benefit, right?
12:38Whenever the same content is streaming on multiple channels, most customers would prefer
12:44to go with Netflix, since there is a better experience on that specific platform.
12:51So offering such good user experiences will create a positive brand perception and the
13:00overall customer lifetime value from such applications will be very high.
13:06You can reduce your costs.
13:09You will have better efficiencies when compared to other platforms.
13:14Now talking about Headspin's benchmarking solution.
13:18On the right side, what you see over here is a basic snippet of how the peer benchmarking
13:29solution works at Headspin.
13:31There would be a target application; if you're talking about the
13:38Shudder app, the Shudder app is the target application, and peer 1, peer 2, peer 3, and peer 4 could
13:43be any other competing applications that they are trying to look into.
13:52So it could be Amazon, Netflix, et cetera.
13:57So the screenshots that you see here at the bottom are part of a Headspin benchmarking
14:03activity that we carried out for a specific social media company.
14:12So, how we deliver this solution, and what's in it for the clients: we
14:23help clients evaluate their competitive performance.
14:30Then we ourselves monitor their business-relevant KPIs.
14:36We capture these KPIs for the client and we help them continuously monitor them.
14:42Then Headspin's AI-based engine captures the issues across these sessions
14:50that are run.
14:52And it also provides AI-based recommendations on how you can fix these issues.
15:02So, talking about industry-specific impacts, our solution can be tailored to
15:15the various industries that we are working with.
15:21So for a retail company, the KPIs that are relevant, or the basic user flows
15:28that are important, would be different when compared with some other industry.
15:35So for a basic retail application, the major business flow or user flow that is important
15:44is for a customer to actually buy a product.
15:49So the overall responsiveness of the application is the most important user journey, or
15:55metric, for measuring performance for a retail or e-commerce application.
16:03So what we do is we proactively help them identify issues and provide insights to fix them.
16:12For them, the relevant issues are things like slow-loading page content and long loading animations.
16:18And we also help them do a root cause analysis and fix the time to launch or open
16:29specific pages in these applications.
16:33Then coming to our solution for media and OTT platforms.
16:40There, we evaluate the overall performance of the applications, mainly focusing on video
16:48streaming quality: buffering, audio, and video.
16:53What Headspin does is capture KPIs like Vmos, which is a reference-free metric
17:01that lets them evaluate their video streaming performance without needing a reference, such as
17:09a standard high-quality video stream, to compare against.
17:15So basically, the perceived user experience of video streaming is captured through
17:21this Vmos.
17:22Similarly for audio quality. Other KPIs that we capture are video load times and search response
17:31time across different platforms, and you can compare the performance across
17:37locations, devices, networks, et cetera.
17:42So when we say devices, we are not just limited to mobiles, browsers,
17:50or tablets.
17:54We also cover OTT devices like Apple TV, Fire TV, Android TV, and smart TVs.
18:01We also cover DRM-protected content evaluation.
18:07That's done with a specific solution called the Headspin AV solution.
18:11Then, yeah, talking about telco.
18:16For telcos, we have a very wide range of solutions; benchmarking can be done to
18:27compare the quality of your services across locations, across devices, et cetera.
18:37So basically, it helps validate call quality and network quality,
18:42test under roaming conditions, compare performance across
18:524G and 5G networks, test under network throttling conditions, et cetera.
19:00Where we mainly help telecom companies is when it comes to scalability, right?
19:08Telecom companies would need to test across a wide variety of locations.
19:14So we have a presence in 50-plus locations, which lets them easily set up devices and
19:21leverage Headspin infrastructure to compare or test the efficiency
19:28of their products.
19:31Then, talking about banking and financial institutions.
19:35For them, the most important thing is to evaluate smooth transaction experiences
19:44for their clients.
19:47If you look at India, the major players when it comes to B2B transaction apps
19:53are Google Pay, PhonePe, Amazon Pay, Paytm, et cetera.
19:59So Headspin can evaluate the performance of a specific app in comparison with these
20:06applications, and then give you an overall benchmark of where your application stands when
20:14compared to these specific applications.
20:18So apart from B2B payments, there are other use cases that are relevant for banking and
20:24financial institutions, like QR-code-based scanning
20:33and payment, et cetera.
20:36Similarly, if you look into different banking applications, there are major banks
20:45in India,
20:47and they have various channels: mobile apps, internet banking.
20:52So the performance across these platforms can also be compared, and an overall picture
20:59of where you stand can be built out.
21:02Similarly, for financial institutions that are into trading,
21:10they can compare performance before and after a specific time, like the opening and closing
21:17times of the market.
21:19So during these windows, the performance can be evaluated.
21:24Now looking into how we do this.
21:27So basically, it's a series of steps.
21:32First, we try to identify the key KPIs and the key user flows that are relevant for our
21:39client.
21:41Then we create automation scripts for these flows for the client.
21:49Along with this, we also try to identify the KPIs and the custom dashboards
21:59that the client specifically wants, so that all of these can be created.
22:04When it comes to the creation of the automation scripts,
22:11either Headspin can do it, or, if the clients have the infrastructure
22:19or capabilities to build these, they can create the automation scripts themselves.
22:24These can then be executed on devices.
22:27While these tests are being executed on the Headspin platform, if the KPIs are already identified
22:35and the dashboards are built, the data will automatically get dumped onto
22:41the dashboards so that you can continuously monitor the performance of these KPIs
22:48across these dashboards.
22:50So basically: identification of KPIs, creation of automation scripts, and then execution
22:59of these tests on the devices, on the platform.
23:03So once the tests are run for your application versus the competing peer applications,
23:11the dashboards will be populated automatically and you can have a view
23:16of the final benchmarking report.
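To make the "automation scripts executed on devices" step more concrete, here is a minimal sketch of the kind of script that could time a homepage launch on an Android device using Appium. The package name, launch activity, element id, and local Appium endpoint are placeholders; this is not Headspin's own tooling or API, just an illustration of the technique.

```python
# Minimal sketch: re-launch an app and time how long its home screen takes to appear.
# Assumes an Appium server at the placeholder URL and a connected Android device.
import time
from appium import webdriver
from appium.options.android import UiAutomator2Options
from appium.webdriver.common.appiumby import AppiumBy
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

PACKAGE = "com.example.videoapp"  # hypothetical app under test

options = UiAutomator2Options()
options.app_package = PACKAGE
options.app_activity = ".MainActivity"  # placeholder launch activity

driver = webdriver.Remote("http://localhost:4723", options=options)
try:
    # Stop the app, then re-launch it and time the home screen's appearance.
    driver.terminate_app(PACKAGE)
    start = time.monotonic()
    driver.activate_app(PACKAGE)
    WebDriverWait(driver, 30).until(
        EC.presence_of_element_located(
            (AppiumBy.ID, f"{PACKAGE}:id/home_rail")  # placeholder home-screen element
        )
    )
    home_load_s = time.monotonic() - start
    print(f"homepage launch time: {home_load_s:.2f} s")
finally:
    driver.quit()
```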
23:21And the frequency of these benchmark reports is flexible: whether you want them
23:27once a month or maybe once a week,
23:38such requirements can be met with Headspin.
23:44These are some snippets of a few dashboards that we have created in the past.
23:50The first one that you see is the homepage launch time of a few social
23:59media applications: Facebook, Instagram, Pinterest, and YouTube.
24:05Here, if you look, Pinterest is taking the longest time to load, whereas
24:13the others, Facebook, Instagram, and YouTube, are along similar lines, right?
24:22Similarly, search time across these same applications.
24:29And the third dashboard that you see here is a performance comparison across two different
24:34devices:
24:36an iPhone XR and an iPhone 6S,
24:40that is, a mid-range Apple device and a low-end Apple device.
24:47This can be varied: it can be created for Android versus iOS, or for performance
24:55on a browser versus performance in a mobile app.
25:04Then it can be compared across geographic locations.
25:07And the last one that you see here is the download time calculated across these applications.
25:17Now, if you're interested, book a demo with Headspin, and we can help you
25:26analyze where you stand against your competitors.
25:33So that ends it.
25:38Let me go to the questions.
25:46Okay, so the first question is: how do you combine the qualitative and quantitative experience?
26:01How does Headspin provide a combined view, right?
26:06So Headspin captures multiple metrics.
26:11One of them is called the Headspin impact score.
26:16What it does is calculate the overall impact time within a single session
26:24compared to the overall duration of that specific session.
26:29So the number that comes out of that quantifies the overall user experience
26:39of that particular session.
26:43So that can be considered the perceived user experience, and you can see it is a combination
26:50of both qualitative and quantitative issues.
26:54So in Headspin, the qualitative issues are captured in the form of issue cards that can be
27:01viewed on a time series.
27:04And the results can be examined on that time series, where you can easily go and find
27:11the root cause of a specific issue.
27:14So that's how Headspin does it.
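As a rough numeric illustration of the ratio described above (not Headspin's exact formula), the idea is to relate the time a session spends visibly impacted by issues to the session's total duration. The values below are made up.

```python
# Rough illustration only: total time flagged as "impacted" (blank screens,
# loading animations, stalls) relative to the session's total duration.
impacted_intervals_s = [2.4, 1.1, 3.0]   # durations of flagged issue intervals
session_duration_s = 60.0

impact_time_s = sum(impacted_intervals_s)
impact_ratio = impact_time_s / session_duration_s
print(f"impacted {impact_time_s:.1f} s of {session_duration_s:.0f} s "
      f"({impact_ratio:.0%} of the session)")
```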
27:18Then the next question says, are the dashboards flexible?
27:23How do you capture custom KPIs?
27:26So basically, Headspin has a feature called labeling and annotation, with which
27:32you can create your custom KPIs.
27:35Say, for a retail application, buying a product is a particular
27:44user flow.
27:45If you want to capture a specific KPI, say from the product search until
27:56the product page is completely loaded, you can create an annotation for that;
28:04it will be labeled and it will be captured across all the sessions.
28:08So that is how custom KPIs are created in Headspin.
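To make the idea of such a custom KPI concrete, here is a generic sketch that times the span from submitting a product search to the product page being fully rendered, using plain Appium waits. The locators and helper function are hypothetical, and this is not Headspin's labeling and annotation API, which instead attaches such labels to the captured session.

```python
# Generic illustration of a custom KPI: time from submitting a search until the
# product page is rendered. Locators are placeholders for a hypothetical shop app.
import time
from appium.webdriver.common.appiumby import AppiumBy
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def measure_search_to_product_page(driver, query: str) -> float:
    """Return elapsed seconds between submitting a search and the product page loading."""
    search_box = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "search")  # placeholder locator
    search_box.send_keys(query)
    start = time.monotonic()
    driver.press_keycode(66)  # Android ENTER key submits the search
    # Tap the first result, then wait for a placeholder "product loaded" marker.
    WebDriverWait(driver, 20).until(
        EC.presence_of_element_located((AppiumBy.ID, "com.example.shop:id/first_result"))
    ).click()
    WebDriverWait(driver, 20).until(
        EC.presence_of_element_located((AppiumBy.ID, "com.example.shop:id/add_to_cart"))
    )
    return time.monotonic() - start
```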
28:16The next question is, how do you capture DRM-protected content?
28:21Can that be compared across multiple platforms?
28:26Yes.
28:28So DRM-protected content is captured using a setup called the Headspin AV platform.
28:37How this works is that there is a Headspin appliance called the AV box,
28:46inside which there is a monitor and a support device, along with the device under
28:53test on which you play the specific DRM-protected content; it is recorded with the support
29:01device,
29:02and on the support device, all of these metrics are captured.
29:07So that is a solution that can easily be used for testing DRM-protected content.
29:17It not only covers DRM-protected content, it also helps with
29:23use cases involving device-to-device interaction, like video conferencing across
29:32two devices, or giving audio commands to a specific device using
29:40a smart speaker.
29:41So all of these use cases can be tested using this AV box solution.
29:47So that is how basic DRM-protected content testing happens at Headspin.
29:56I think then that is the end of it.
29:59Thank you.
30:00Thanks, everyone.
30:01Thank you.
