
Redesigning podcast analytics

At Acast I researched and redesigned the Acast Insights platform. In this case study I focus on three initial quick wins: 

  1. Increased mobile traffic from 21% to 42% (+100%) by improving mobile responsiveness.
  2. Reduced the bounce rate from 41% to 26% (-37%) by improving table graph usability.
  3. Increased a page's share of total visits from 1.8% to 5.2% (+189%) by improving site navigation.

Background

Acast is the world's leading independent podcast company, helping creators and advertisers of all sizes to reach listeners in the most immersive environment in the world.

Acast Insights helps creators understand what content and marketing resonates with their listeners, as well as how well they perform in terms of revenue.

At Acast, I was responsible for designing the creator tools and the mobile listening app. While conducting interviews for the publishing tools, I also came across issues related to the Insights platform. The platform had not received design attention in a while, so I proposed researching its current state.


Acast Insights.

Internal research

I started out with internal research to understand what we already knew about the product and its users. I interviewed stakeholders and facilitated workshops with colleagues from different parts of the organisation to gather existing user data (e.g. customer support), technical constraints and ideas.

I also explored current Google Analytics data to understand website behaviours, and benchmarked the designs of competing analytics tools and products.


Workshops in Miro to gather existing data and ideas.

User research

I used the internal research to inform a user research plan. With no previous user research around the platform, I decided to recruit and interview 6 creators with varied backgrounds to understand their goals, behaviours, problems and needs.

To engage the team in the results of the research, I invited them to listen in during each interview and shared empathy maps and recordings in Slack.


Empathy maps and behavioural mapping.

When cross-analysing data between participants I found behavioural patterns. I analysed these further and modelled them into three user personas to better communicate the findings.


Three user personas.

Prioritisation

After communicating the user personas, problems and technical constraints, I facilitated prioritisation workshops to assess each problem's effort, impact and importance. Together we identified three quick wins to focus on first, and a few longer-term efforts to start exploring.

Moreover, I facilitated an ideation workshop around the quick wins to gather more details. The problems we decided to tackle first were:

  1. Lack of mobile responsiveness
  2. Unclear site navigation
  3. A confusing table graph component

Effort / impact & MoSCoW prioritisation workshops in Miro.

1. How might we improve mobile responsiveness?

Mobile usage accounted for 21% of website traffic, but users expressed difficulties with the platform's lack of mobile responsiveness.

Components like the menu and pagination did not shrink horizontally, requiring users to zoom in and scroll sideways to see content. The login page remained zoomed out, and padding and typography did not scale.


The menu and login page did not adapt to mobile sizes, forcing users to zoom in on content.

I started wireframing designs and flows to fix the problems related to the menu bar. After reviewing directions with the other designers and the team, I proceeded to detail the designs and build prototypes for usability testing.


Wireframes for menu and pagination.


Responsive menu, login page and font scales for desktop, tablet and mobile.
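
The responsive behaviour itself lives in the front end rather than in the design files. As a rough illustration, a breakpoint-driven menu could be wired up along these lines; the 768px breakpoint, element id and class names are assumptions for the sketch, not Acast's actual implementation:

```typescript
// Minimal sketch: switch the menu between a full bar and a collapsed drawer
// at an assumed 768px breakpoint. Ids and class names are hypothetical.
const mobileQuery = window.matchMedia("(max-width: 768px)");

function applyMenuLayout(isMobile: boolean): void {
  const menu = document.getElementById("insights-menu");
  if (!menu) return;
  menu.classList.toggle("menu--collapsed", isMobile); // drawer on small screens
  menu.classList.toggle("menu--bar", !isMobile);      // full bar on larger screens
}

// Apply once on load, then again whenever the viewport crosses the breakpoint.
applyMenuLayout(mobileQuery.matches);
mobileQuery.addEventListener("change", (event) => applyMenuLayout(event.matches));
```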

2. How might we increase feature findability?

Users expressed a need to understand episode performance over time, and a segment of users maintained their own Excel files for manual analysis. Most users had not discovered that the platform already had a comparison feature.

The feature was hidden at the end of a long submenu, under an unclear title, and accounted for only 1.8% of page visits. I assumed that a shorter submenu and a clearer title could improve the feature's discoverability, and mapped out the tabs' features and alternative groupings.

I proposed renaming the compare tab to performance and moving it up to the second position in the submenu. I then grouped similar behaviour and demographics data into an audience tab, reducing the total number of tabs. The assumption here was that a clearer title in a more prominent position would drive more traffic.



Old submenu on top and new structure below.

3. How might we improve table graph usability?

Apart from not finding the comparison tab, users had difficulty understanding how to use the table graph component that displayed the comparisons. The component was used to compare different types of data and caused similar usability issues across the platform, so we prioritised improving it, as the fix would benefit the platform and the design system as a whole.

Users had trouble finding the dropdown for adding items to compare. Another issue was that each item had to be added manually, one at a time, from a long list. Other table graph components contained unfamiliar content, such as various app and website names, which made it difficult to know which ones to add.


Users missed the dropdown, and couldn't search or easily get an overview of the episodes.

I explored various concepts and wireframes, and focused on one where all episodes were displayed directly, with a default selection plotted in the graph. With this design, users could immediately get an overview and sort to find which episodes would be interesting to plot.

I also introduced a search bar to help creators with lots of content, and a percent-of-median column for a quick performance comparison between episodes. The pre-plotted selection of episodes was based on this new column.
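
To make the idea concrete, a sketch of the logic behind the percent-of-median column and the pre-plotted default selection could look like the following; the data shape, field names and top-five cutoff are assumptions for illustration, not the actual implementation:

```typescript
interface Episode {
  id: string;
  title: string;
  listens: number; // listens in the selected period (hypothetical field)
}

// Median listens across all episodes.
function median(values: number[]): number {
  if (values.length === 0) return 0;
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0 ? (sorted[mid - 1] + sorted[mid]) / 2 : sorted[mid];
}

// Percent of median: how each episode performs relative to a typical episode.
function withPercentOfMedian(episodes: Episode[]) {
  const med = median(episodes.map((e) => e.listens));
  return episodes.map((e) => ({
    ...e,
    percentOfMedian: med === 0 ? 0 : Math.round((e.listens / med) * 100),
  }));
}

// Default plot selection: pre-check the episodes ranking highest on the new column.
function defaultPlotSelection(episodes: Episode[], count = 5): string[] {
  return withPercentOfMedian(episodes)
    .sort((a, b) => b.percentOfMedian - a.percentOfMedian)
    .slice(0, count)
    .map((e) => e.id);
}
```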


Displaying all items directly, with checkboxes for plotting.


New table graph design for episode performance.

Usability testing

To support testing of the designs and user flows, I built two prototypes in Figma, one for desktop and one for mobile, and merged them with prototypes I had built earlier for added context. I typically prototype flows throughout the process as a way to make sense of them in combination with the designs, so at this stage I could reuse many interactions and assets for a more realistic experience.

I then conducted 5 usability tests with creators and adjusted the designs based on the results.


Prototypes in Figma.

Implementation

I collaborated closely with the developer throughout implementation, adjusting designs along the way for technical constraints and effort. He was familiar with Figma at this point and could easily inspect my pages, and we often had brief syncs. 


Design and implementation.

Results

Looking at the results in Google Analytics after implementation:

  1. Improved responsiveness increased mobile traffic from 21% to 42% (+100%).
  2. Usability improvements to the compare table graph reduced the bounce rate on the tab from 41% to 26% (-37%).
  3. Renaming the compare tab to performance and moving it to the second position increased page visits from 1.8% to 5.2% (+189%).
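
(Relative changes are measured against the starting value: for example, mobile traffic going from 21% to 42% of visits is a (42 - 21) / 21 = +100% change; the same calculation gives roughly -37% for the bounce rate and +189% for page visits.)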