Welcome to the world of competitive video gaming, also known as esports. In esports, like with any sport, “athletes” need to track and analyze their progress in order to level up. Statistics play a big role in this, and serve as the foundation of Tabstats, the gaming stats tracker that I designed.
To that end, we set three product goals for Tabstats:
Comparing progress against other gamers can help players set actionable goals to move up the leaderboard.
Data can help players discover strengths, weaknesses, and patterns in their gameplay, pointing to specific ways to improve.
Gaming is inherently exciting, and competition triggers the brain’s reward centres even further. Tabstats should serve up stats in a way that keeps this dopamine flowing.
Video games evolve rapidly, meaning video game-based products need to evolve even faster. To ensure our product reached the market before our ideas were deprecated, leadership often operated on the principle of “good, fast, cheap”.
Designing a product at the intersection of these three pillars was tough, and the customer experience was often the first thing to be sacrificed. That being said, our team worked strategically to achieve all three of “good, fast, and cheap” where possible, and to make tradeoffs where necessary, as you will see throughout this case study.
Gamers are very vocal online, which made it easy to gather user feedback on existing stats trackers, and understand both the motivations and pains that competitive gamers currently experience with these platforms.
Observing live chat mid-stream, Twitch
1-on-1 chat with gamer, Discord
User post about stat tracking, Reddit
I supplemented the user research with a competitive analysis, through which I scoured 10+ gaming sites to identify their core features and understand common design patterns that could be leveraged in our product. See example:
Thanks to gamers’ propensity to have in-depth conversations online, and the ease of access to competitor products, we were able to perform research in a manner that was:
Good: High-quality insights, obtained by combining customer and competitor research.
Fast: Processes I was familiar with and could largely complete independently.
Cheap: No research software or user incentives (e.g., gift certificates) needed at this stage.
I consider flow maps to be the best form of low-fidelity design. Plotting out the entire Tabstats platform, with all the features the customer and competitor research suggested we include, helped us scope out the MVP while also planning how the product would grow. The exercise was:
Good: Helped plot our roadmap (a project manager’s kryptonite).
Not fast: Required an entire sprint to complete.
Cheap: Saved hours of aimless design work that would have occurred without a defined feature set.
The map went through several iterations and reviews with the product owners and developer leads, enabling them to fill in any gaps, offer feasibility-based feedback to determine what to include in our MVP, and visualize how the product could grow sustainably over time.
Unexpected Bonus
The flow map also turned into a critical aspect of onboarding, providing a clear visual roadmap of our product for all new hires to quickly get them up to speed.
Based on the product goals that were set at kickoff, and the feedback we got from users about how stats trackers could be improved in terms of their UI/UX, we also crafted a set of design-specific goals to guide our work.
Minimizing Clutter: Prioritize the content that is most essential to the user to avoid information overload.
Communicating Effectively: Present the information in a way that is easy to consume and extract meaning from.
Getting Excited: Extending the third product goal, leverage visual design principles to make the platform content engaging for the user.
Meeting our design and larger product goals required tons of iteration. Here’s an example demonstrating a few of the iterations that our seasonal stats module went through in order to comply with all our goals.
Communicating Effectively: I emphasized the primary seasonal stats (KD, Kills, Deaths, and Abandons) through visual hierarchy.
Getting Excited: The progress bar between the Diamond and Champion ranks visually communicates how close the player is to the next rank to build excitement (the math behind this bar is sketched after this list).
Getting Excited: I added the season cover image to the module for more visual appeal.
Communicating Effectively: I tested out a different layout to avoid having the stats rows run across the entire width of the screen, as it was hurting readability and making the module overwhelming.
Minimizing Clutter: I removed a few non-essential stats from the module based on team feedback.
Getting Excited: Additional game imagery brought the module to life.
Minimizing Clutter: The percentiles and corresponding percentile bars in the top row of primary stats weren’t adding much value, so they were eliminated in favour of a leaner module.
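To make the module’s headline numbers concrete, here is a minimal sketch of the arithmetic behind them. All field names, rank thresholds, and example values are hypothetical, chosen only to illustrate the KD ratio and the progress-bar fill between ranks:

```typescript
// Hypothetical data shape for a player's seasonal stats; the real
// Tabstats model is not shown in this case study.
interface SeasonalStats {
  kills: number;
  deaths: number;
  abandons: number;
  mmr: number;           // current matchmaking rating
  rankFloor: number;     // MMR where the current rank (e.g., Diamond) begins
  nextRankFloor: number; // MMR required for the next rank (e.g., Champion)
}

// KD is the classic kills-to-deaths ratio shown at the top of the module.
function kdRatio(stats: SeasonalStats): number {
  return stats.deaths === 0 ? stats.kills : stats.kills / stats.deaths;
}

// Fraction used to fill the progress bar between the current and next rank,
// clamped to [0, 1] so the bar never under- or overflows.
function rankProgress(stats: SeasonalStats): number {
  const span = stats.nextRankFloor - stats.rankFloor;
  const progress = (stats.mmr - stats.rankFloor) / span;
  return Math.min(Math.max(progress, 0), 1);
}

const example: SeasonalStats = {
  kills: 412, deaths: 305, abandons: 3,
  mmr: 4750, rankFloor: 4400, nextRankFloor: 5000,
};
console.log(kdRatio(example).toFixed(2)); // "1.35"
console.log(rankProgress(example));       // ~0.58, just over half-full
```

In this made-up example, 412 kills against 305 deaths yields a 1.35 KD, and an MMR of 4750 sits just over halfway between the hypothetical Diamond and Champion thresholds.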
As screens were built and the platform grew, it became clear a design system was needed to resolve inconsistencies and drive operational efficiency. The final system was:
Good: A weak design system would’ve been disastrous and created more problems than it solved, so quality was non-negotiable.
Not fast: It was my first time building a design system from scratch, and I took the time needed to build it right. This meant researching all the greats (Material Design, Polaris, etc.), creating an organization structure (sketched below), and a lot of iteration and upkeep.
Cheap: With myself as the primary designer, we saved money by not hiring an additional designer to manage the design system work, but that meant sacrificing speed.
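For a flavour of what that organization structure looked like, here is a small, hypothetical sketch of design tokens in code. The names and values are illustrative, inspired by systems like Material Design and Polaris, and are not taken from the actual Tabstats system:

```typescript
// Illustrative design tokens only; every name and value here is hypothetical.
export const tokens = {
  color: {
    background: { primary: "#0E1116", secondary: "#161B22" },
    text: { primary: "#FFFFFF", muted: "#8B949E" },
    accent: { positive: "#3FB950", negative: "#F85149" },
  },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24, xl: 32 }, // px
  typography: {
    statValue: { fontSize: 28, fontWeight: 700 },
    statLabel: { fontSize: 12, fontWeight: 400 },
  },
} as const;
```

Components consume tokens instead of hard-coded values, so a change made in one place propagates everywhere, which is exactly the consistency a design system exists to provide.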
Our product owners were tempted to forgo user testing in favour of a “release it and see what happens” approach. However, I did convince the team to run virtual tests with a handful of my coworkers’ gamer friends to get feedback on some key design decisions.
Not as good: Testing with friends meant the results could be biased, and the users may not have aligned perfectly with our target audience. The lack of a UX researcher or formal research software also meant test quality was limited by my capabilities as a designer.
Fast: Getting access to and scheduling sessions with these users was quick because they were known contacts. The user tests were also kept brief, with just a few screens shown at a time.
Cheap: Given that most users were volunteers, no compensation was required, and no formal research software was purchased or used.
The following is an example of an A/B test we ran with our users.
Variation A: Prioritizing the design goal of “getting the user excited” by emphasizing game colours and branding.
Variation B: Prioritizing the design goal of “minimizing clutter” with lean, monochromatic modules.
Although the first variation was cited as more attractive, users felt it was too flashy for a sidebar and drew their attention away from the primary content. The second, more muted variation could sit in the user’s peripheral vision, available when needed but easy to ignore while focusing on the primary content.
Takeaway
Not all of our design and product goals should be weighed equally. In this example, although creating visually appealing designs was important, users prioritized clarity, especially when consuming such a large volume of information.
Minimizing clutter > Getting the user excited
Handoff did not require a heavy lift since I maintained a close relationship with our dev teams, meaning they had insight into the process and designs throughout the course of the project, as opposed to one big final reveal at the end. When deliverables were handed off, nothing came as a surprise.
Good: Devs provided feasibility-based feedback as we went, enabling ongoing iteration instead of large chunks of design work being redone at the end.
Fast: Short, sub-5-minute meetings let us discuss and resolve issues on the go and operate as an agile team.
Cheap: Fewer long meetings meant more output from our work hours.
For extra support, I crafted an interactive prototype of the “happy path” flow at mobile, tablet, and desktop breakpoints, enabling our dev teams to get a good “feel” for the product.
This was supplemented with note cards that included any necessary specs, and interaction cards that outlined specific animations that we desired.
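For readers unfamiliar with breakpoints, here is a minimal sketch of the idea. The pixel thresholds and helper below are hypothetical placeholders, not values from the actual product:

```typescript
// Hypothetical breakpoint thresholds; the real values are not listed
// in this case study.
export const breakpoints = {
  mobile: 0,     // 0-767 px
  tablet: 768,   // 768-1279 px
  desktop: 1280, // 1280 px and up
} as const;

// Picks the layout variant to render for a given viewport width.
export function layoutFor(width: number): keyof typeof breakpoints {
  if (width >= breakpoints.desktop) return "desktop";
  if (width >= breakpoints.tablet) return "tablet";
  return "mobile";
}
```

Whatever the exact thresholds, the prototype’s job was to show how each screen reflowed across these three layout variants.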
Midway through this project, I was promoted to design lead, and tasked with building the product and the design team. Of the many challenges I faced and lessons I learned as a first-time manager, one of the most important was learning to drop the DIY approach.
For example, in their first few weeks on the job, the designer I’d hired made several small errors when designing responsive components. Having made those mistakes myself in the early days and since learned how to correct them, it was easiest for me to just go in and resolve them rather than explain the error and the fix to the designer. This worked, until the same mistakes kept recurring.
At that point, I realized it was time to take a backseat and allow the designer to learn by doing. I pointed out the mistake, asked them to resolve it, and was on standby if they needed support.
The first time, it took longer than it would’ve taken me to resolve the error myself. The second time it was faster. There was no third time because the designer had learned the correct process.
Being a good manager meant learning when to be in the trenches, and when to sit back and play the quality control role, in order to optimize the growth of the product AND the team.