A British broadcaster decided to launch a trial customer portal as part of a new self-serve offering. The application would allow businesses to purchase ad space on digital platforms.
I was tasked with leading the UX, working alongside a product manager, product owner, market researcher, visual designer and developers.
Our goal was to bring the product to market quickly and evolve our understanding of the market need through an initial Alpha trial.
- Deliver the trial application
- Gain insights from participants using the real application
- Improve the application as a result of participant feedback
Here, I will talk about the Alpha release. Since then, we have also designed and launched the Beta trial, which is currently available to the public.
We had a total of 10 weeks to design, build and launch the product to a selected customer base. We were going to follow a learning-focused approach to validate and optimise the proposition.
Where do you start, when faced with a blank canvas?
A completely new product, a new team and ambitious deadlines. It was the first time my agency had taken on development work of this scale, and we had to figure out a new way of working together. I was the bridge between the product team on the client side and the development team on the agency side, which meant there was a lot of expectation management to do.
1- Understanding the domain
I started by studying direct and indirect competitor products so that I could:
- Build domain awareness
- Understand the building blocks of a self-serve advertising platform
- Collect a list of features we might introduce at different stages
- Consider how our proposition and model stand against the competition
2- Usability test planning
We wanted to get the designs and proposition in front of users quickly and continuously iterate based on our learnings. Every two weeks I would run four usability testing sessions, with a total of 16 participants throughout the project. This also meant I had to get a prototype in front of users within three weeks of kick-off.
I defined priorities carefully and got stakeholder buy-in on design sprint planning.
Through discussions with the product team and desk research, I had a rough picture of the parts that were going to make up our Alpha application. I proposed breaking the design delivery into the following bi-weekly sprints:
- First sprint: Core product. This was the campaign creation piece. This would allow us to start validating the “must have” part of the product from very early on, and allow plenty of time to refine and iterate.
- Second sprint: Landing page and account registration. This was a “must have” too, but if the development timelines slipped we could also support these through a concierge model.
- Third sprint: Integration with the Help centre and TBD features. These were “should haves”. We planned to launch the product at this stage (weeks 6-8).
- Final sprints: Improvements and “nice to have” features based on user and business feedback.
With this plan, if the launch date slipped for any reason, we could still continue learning and iterating in a controlled test environment.
3- Architecting the journey
While requirements were still being clarified throughout the project, I needed a place to start.
I prepared variations of state diagrams and user flows in order to kick off discussions with the team and start making decisions on a direction.
Due to the nature of the service, user requests had to go through different levels of approval. This would require the user to leave the journey and return to the system later to complete their request. It also complicated the flow, and we had tight deadlines. So I simplified the flow by moving all approvals to after request submission, where they are no longer a concern for the user.
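The simplified lifecycle can be sketched as a small state machine. This is purely illustrative: the state names and transitions below are my own stand-ins, not the product's actual model.

```typescript
// Illustrative states for a campaign request. Everything up to "Submitted"
// is user-driven; every approval state sits after submission, so the
// user's journey ends cleanly at "Submitted".
type RequestState =
  | "Building"
  | "Submitted"
  | "InReview"
  | "Approved"
  | "Rejected"
  | "Live";

// Allowed transitions between states.
const transitions: Record<RequestState, RequestState[]> = {
  Building: ["Submitted"],            // the only user-facing transition
  Submitted: ["InReview"],            // back-office takes over from here
  InReview: ["Approved", "Rejected"],
  Approved: ["Live"],
  Rejected: [],
  Live: [],
};

function canTransition(from: RequestState, to: RequestState): boolean {
  return transitions[from].includes(to);
}
```

Keeping every approval state after "Submitted" means the user never has to re-enter the journey mid-flow, and the front end only needs to render the user-driven states.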
4- Exploring variations
In the early days, many of the requirements were undefined, so I started with some assumptions. I wireframed variations of the campaign creation journey in Sketch. Even when they are not accurate, wireframes help spark conversations and lead to more productive stakeholder meetings.
We decided to go with the option that was quicker to build. ‘Optimisations’ could come later, once we had the core product running and had collected user feedback.
PROTOTYPE, TEST, ITERATE
When starting with a blank canvas, everything seems possible and ideas come from all directions. Add to that a six-week launch deadline and very limited development resource. It was crucial to focus on a simple solution that could be launched in time. That meant some ideas had to be dropped from the beginning, or saved for later.
At first I stitched the wireframes together in InVision as a prototype. InVision's limited capability when it came to key interactions led me to move to Axure.
I moderated 4 bi-weekly user testing sessions using prototypes with increasing fidelity.
Step-by-step campaign builder… But which step first?
Two weeks from facing the blank canvas, I had a low-fidelity wireframe prototype of the multi-step campaign creation wizard, ready to validate with users.
User testing #1 top takeaways:
- First-time users came to the product with an exploration mindset. They wanted to try out the tool to get a feel for it; actually submitting a request was going to take more preparation. I re-ordered the steps to give more visibility and control over the options that matter to first-time users.
- They had questions that needed answering while interacting with the product. I incorporated the answers to these questions into the designs.
I took the user testing findings into consideration and re-designed the steps to create a campaign.
New information, new flow
At this stage, an ambiguous feature started to become clearer.
People would be able to upload their own creative to the campaign, as well as hire a professional to create it for them. The initial understanding was that ‘hiring a professional’ would be a core part of campaign creation. But as the business requirements became clearer, I saw that:
- There wasn’t any commitment to ‘hiring a professional’
- This approach was going to limit users’ choices in the long run
- It was going to create extra work for the developers handling campaign states and edge cases, which was unnecessary for Alpha
The more information I collected, the more obvious it became that this had to be an add-on rather than a core part of the journey. Users would have to leave the campaign flow if they wanted to hire a professional. I sketched out new screen flows and journeys and went back to explain the new solution to the team. It was not only the simplest solution, but the only feasible one.
But the solution that was so obvious to me wasn’t as obvious to the team yet. It required a perspective change, and it took several meetings to get buy-in from both internal and external stakeholders before everyone was on board with the new flow.
Options, pricing and account creation
I took the prototype to the next level of fidelity, refining the campaign options and tying the interactions to a more accurate representation of the pricing model.
User testing #2 top takeaways:
- Some targeting features got users excited at first, but they quickly realised that they didn’t know which ones to choose to get the best possible outcome. I requested additional data to show relevance and help people make decisions.
- Most users struggled to get their heads around the pricing model. I worked with our visual designer to find a way to communicate the model better.
- People found creating a campaign easy and intuitive. But what they would be getting for their money wasn’t clear, and this was causing hesitation. I took the findings back to the product team to discuss copy additions and potential features that might support users’ decision making.
Optimising the designs for development to hit target launch date
It was week 5 and the planned launch date was approaching fast. I pulled in colleagues for one last round of optimisation based on our dev capacity. We decided to remove the ability to save a ‘Draft’ campaign, as this massively reduced the states and scenarios we would otherwise need to handle. From a UX perspective it wasn’t a big loss, as the campaign form was already short and easy to fill in.
Copy and consistency
I had been designing with real content from the beginning. In this iteration I focused on refining the copy, adding explanations and tooltips. I made sure we revealed the right information, at the right time, in the right places.
The copy received from the client was, at times, quite lengthy. I cut, rewrote and broke the copy into sections to improve readability and scannability. After the UI was built and running, I continued optimising the copy in the code.
User testing #3:
I created a new prototype with the branded designs I took over from the visual designer. It started to feel a lot like the real product, and I could observe a change in users’ reactions to little things that hadn’t mattered to them before, such as the wording on the sign-up form. Some people described it as ‘scary’, as they felt there was a commitment to signing up even though they just wanted to explore the product.
I changed the wording in the invitation email, landing page and sign-up form to a more inviting tone.
Ensuring our front-end matches the design
Dev was ready to launch the product, and UX had reached a state where most of the focus needed to be on refining and polishing: a quality check, making sure the build was aligned with the designs. So I started testing, raising and tracking UI issues on GitLab.
The story is familiar. Devs have their hands full with issues to fix and new components to build. Copy changes and design tweaks get pushed down to the bottom of the backlog and remain there.
So I decided to get my own branch in the repository and started making copy and design polish directly in code.
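As a rough sketch of that workflow (the repository, branch name and file below are hypothetical stand-ins, not the project's real setup):

```shell
set -e
# Illustrative only: a throwaway repo standing in for the real project,
# so the branch-per-polish workflow is runnable end to end.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email "ux@example.com"
git config user.name "UX"

echo "Sign up now to commit to a plan" > signup-copy.txt   # stand-in UI copy
git add . && git commit -qm "Initial copy"

# UX-owned branch: small copy/design tweaks live here, separate from
# feature work, so developers can review and merge them cheaply.
git checkout -qb ux/copy-polish
echo "Create a free account to explore" > signup-copy.txt
git commit -qam "Copy: soften sign-up wording"
git branch --show-current   # on GitLab, open a merge request from this branch
```

Keeping these tweaks on their own branch means they land as small, reviewable merge requests instead of sitting at the bottom of the backlog.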
Launch and more
With the core product up and running, we had 1.5 weeks left to enhance the application. To me, there was one clear winner in the list of potential features to develop: advanced targeting. It was going to provide the most value to users as well as the business, and would take the least development time.
I avoided advising complex new features that weren’t achievable within our timeframe, and looked for ways to address people’s questions in copy and FAQs.
User testing #4:
For the first time we were testing the product rather than a prototype. There weren’t any concerns about the ease of use, and I collected useful feedback about the copy.
To test the new advanced targeting option before the build, I ran a paper test and asked users to select the options they would choose.
Based on these insights, I advised the team on which options were worth investing time in during the few remaining days.
We launched the product in week 8 to a selected group of users. We received our first orders and, more importantly, re-orders, which was our main measure of success.
The Alpha trial was completed successfully. Since writing this case study, we have designed and launched the Beta release, which is now available to the public.