OPPORTUNITY STATEMENT
FEEDBACK - Content creators don’t have a scalable way to gather readers’ sentiment on their Confluence pages, so they risk repeating mistakes or missing opportunities to improve their content. We saw a gap in the market to address this need, especially with Atlassian Server’s sunset fast approaching.
Role: UX/UI designer, product direction, user research, facilitation, rapid prototyping & user testing
Time frame: 4 weeks
Outcomes: 60+ installs
Goal
We adapted our process to be more agile, aiming to develop an MVP of the app in four weeks and gather user feedback, a departure from our usual method. To define the scope, we combined the MMF framework with user research, and we leveraged Atlassian's design system for both UX and development.
Understand. Build. Test. Reiterate.
Analysis & research
Competitive research
Research was conducted through a series of surveys sent to our two personas: the Confluence content writer at a company, and the readers who consume that content.
We asked each side a series of questions about how they use Confluence pages, to understand what feedback was needed and how it would impact both sides.
Market research took the form of a competitive analysis of similar apps in the Atlassian Marketplace, to understand the existing landscape and where the gap was.
Using the MMF Framework
We used the Minimum Marketable Feature (MMF) framework alongside the research report to understand the key opportunities, impact and risks of the idea.
This left us with a set of actionable solutions and features to focus on, which were then converted into stories so designers and developers could work in tandem.
Requirements
I owned the design work. Alongside user research that ran parallel to our sprints while the developers spiked on feasibility, I researched sentiment-gathering experiences and dashboard displays. A few key trends stood out from that research.
The method of feedback collection - simple and to the point
Targeting and segmentation - targeting specific visitors or customer segments; we chose not to pursue this in this iteration
Survey design and customisation - leaning towards a survey or form builder; we could offer customisation, but we didn’t want to become a survey builder
Reporting and analytics - collecting quantifiable data, then visualising and analysing the feedback so it becomes usable insight: surfacing trends to help businesses identify patterns, make informed decisions and measure customer satisfaction
But not everything could fit in our MVP. Instead, I chose to focus on the two trends that were both the most frequently mentioned and the highest impact.
And the results are…
Method of collection
The method of feedback collection ranged from targeted surveys and persistent pop-ups to macros within Confluence. The experience needed to:
1. Not be too intrusive to the user
2. Still be prominent enough to capture attention
3. Not fight for real estate amongst other apps
4. Be simple, reversible and empowering
We settled on three emoji faces to indicate sentiment, followed by a dropdown of the common comments readers tend to leave on written content.
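As an illustration only (not the shipped implementation, whose code isn't public), a feedback entry captured by this flow can be modelled as a small record: the chosen sentiment plus an optional canned comment. All names and comment strings below are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical scale matching the three faces shown in the byline widget.
SENTIMENTS = ("negative", "neutral", "positive")

# Hypothetical canned comments offered in the dropdown after an emoji is picked.
COMMON_COMMENTS = (
    "Out of date",
    "Hard to follow",
    "Missing detail",
    "Clear and helpful",
)

@dataclass
class PageFeedback:
    """One reader's reaction to one Confluence page."""
    page_id: str
    sentiment: str                 # one of SENTIMENTS
    comment: Optional[str] = None  # one of COMMON_COMMENTS, or None

    def __post_init__(self):
        if self.sentiment not in SENTIMENTS:
            raise ValueError(f"unknown sentiment: {self.sentiment}")

# Example: a reader marks a page positive and picks a canned comment.
fb = PageFeedback(page_id="12345", sentiment="positive",
                  comment="Clear and helpful")
```

Keeping the comment optional preserves the "simple, reversible" goal: picking an emoji is a complete response on its own, and the dropdown is a bonus step.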
Byline
After some research and weighing our app's requirements, we decided the byline was the ideal spot: it offered good visibility and room for future enhancements. But the space was limited, which meant users had to move through multiple screens, selecting an emoji first before reaching the dropdown.
Dashboard
For the content writer, the dashboard needed to:
1. Collect quantifiable data
2. Let users visualise results and analyse feedback so the data can be acted on
3. Show trends and help them identify patterns
4. Empower them to make informed decisions
5. Measure customer satisfaction
Dash (v2)
The Dashboard's job was to provide immediate value for our users: it had to surface actionable insights and key information in an intuitive manner.
To avoid overwhelming users with raw numbers, we opted for a candy-bar design that gives a better visual representation of overall sentiment.
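A minimal sketch of the aggregation behind such a view (hypothetical names; this is not the app's actual code): tally the three sentiments for a page and convert them to percentages, which map directly onto the widths of the stacked candy-bar segments.

```python
from collections import Counter

SENTIMENTS = ("negative", "neutral", "positive")

def sentiment_breakdown(feedback):
    """Tally negative/neutral/positive responses and return each
    sentiment's share as a percentage of all responses (0.0 if none)."""
    counts = Counter(feedback)
    total = sum(counts[s] for s in SENTIMENTS)
    if total == 0:
        return {s: 0.0 for s in SENTIMENTS}
    return {s: round(100 * counts[s] / total, 1) for s in SENTIMENTS}

# Example: six responses on one page.
shares = sentiment_breakdown(
    ["positive", "positive", "positive", "neutral", "negative", "positive"]
)
# Each value is the width (in %) of that sentiment's segment in the bar.
```

One bar per page gives the writer an at-a-glance read of overall sentiment, while the underlying counts remain available for the trend and pattern views.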
IMPACT - Page Feedback for Confluence Cloud officially launched on the Atlassian Marketplace in September 2023. I helped create the marketing material, including the logo and the Marketplace listing, as part of our Labs initiative.
The purpose of Labs is to push an MVP out to customers as fast as possible and use the resulting user research to drive our future decisions.
Retrospective
As the first app our squad developed within the “Labs” program, and our first with Atlassian Forge, I’m proud that we delivered an app of this calibre in only four weeks. The biggest challenge was preventing scope creep: with a much shorter timeline than usual, we had to be strict about which features we could achieve and ensure they added up to an intuitive user experience. It showed me an even more agile way of working; where I thought we were working fast before, I realised there was still room to grow.
That said, I can see a lot of promising potential for Page Feedback in the future. We built with scalability in mind, using semantic code and Atlassian's new design token system. The project taught me the value of daily rituals for staying in touch with the engineers, and it helped me let go of perfectionism and trust users’ reactions more.