UX Case Study


LinkedIn Usability Study: Desktop vs. Mobile Experience

Team: Tijana Mijatović, Angelina Hess, Diana Ivanova

My Role: Test Moderation, Technical Setup, Data Analysis & Visualization (RStudio)

Methods: Heuristic Evaluation, Counter-Balanced User Testing, SUS & SEQ Questionnaires

Overview

This research project evaluated the usability of LinkedIn for its key demographic: university students and recent graduates looking for employment. Our goal was to assess how effectively users could navigate core professional tasks—specifically job searching, managing saved jobs, and interacting with networking events—across both desktop and mobile platforms.

Methodology

We conducted a comprehensive usability study combining heuristic evaluation (based on Nielsen's 10 usability heuristics) with a counter-balanced user study involving 10 participants, 80% of whom were students.

  • Testing Environment: We utilized a within-subject design where participants completed tasks on both mobile and desktop platforms. To prevent bias from existing networks, we used a clean, newly created LinkedIn profile for all tests.
  • Tasks: Users were asked to 1) Find and apply for a job, 2) Save a job and navigate back to it without using pop-up shortcuts, and 3) Find, attend, and leave a networking event without using the search bar.

Key Findings & Usability Issues

The study revealed a stark contrast between platforms: the desktop version earned a "Good" System Usability Scale (SUS) score of 72.8, while the mobile app earned a "Poor" 54.0.
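For readers unfamiliar with the metric, a standard SUS score is derived by rescaling each of the ten 1–5 Likert items and multiplying the adjusted sum by 2.5, yielding a 0–100 score. A minimal sketch (our analysis was done in RStudio; this Python version and the sample responses are purely illustrative):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The adjusted sum is multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    adjusted = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 = item 1 (odd item)
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) * 2.5

# Illustrative respondent: 4s on positive items, 2s on negative items.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

Per-participant scores computed this way are then averaged to give the platform-level figures reported above.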

Inconsistent Labeling (Saved Jobs)

Users struggled to locate saved jobs due to inconsistent naming conventions ("Saved" vs. "My Jobs"). Many mistakenly assumed the feature would be located in their personal profile section.

Hidden Features (Events)

The most significant friction occurred when trying to locate events. On mobile, users relied heavily on trial-and-error scrolling or checking "My Network" because the feature's pathway was unclear.

Lack of User Control

Leaving an event proved particularly difficult, as the option was tucked away behind a three-dot menu. Mobile users struggled most with this step, and the hidden control led to repeated confusion and task failure.

My Contribution

For this study, I played a central role in the execution and quantitative analysis phases.

  • Research & Moderation: I established the technical setup for reliable screen recording and data collection, set up the clean testing profiles, and personally moderated 6 out of the 10 user sessions.
  • Data Analysis & Visualization: I took charge of processing the quantitative results from the Single Ease Question (SEQ) metric. Using RStudio, I aggregated the per-task ease ratings into comparable scores across platforms and created data visualizations to clearly communicate our usability findings.
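The core of the SEQ analysis is straightforward: each participant rates each task once on a 1–7 ease scale, and the ratings are summarized per task and platform. Our processing was done in RStudio; the following Python sketch shows the idea, with hypothetical task names and ratings rather than our study data:

```python
from statistics import mean, median

# Hypothetical SEQ ratings (1 = very difficult, 7 = very easy),
# keyed by (task, platform). Values are NOT from the actual study.
seq_ratings = {
    ("find_and_apply", "desktop"): [6, 7, 5, 6],
    ("find_and_apply", "mobile"):  [4, 3, 5, 4],
}

def summarize_seq(ratings):
    """Return mean and median SEQ rating for each (task, platform) cell."""
    return {
        key: {"mean": round(mean(vals), 2), "median": median(vals)}
        for key, vals in ratings.items()
    }

summary = summarize_seq(seq_ratings)
print(summary[("find_and_apply", "desktop")])  # -> {'mean': 6.0, 'median': 6}
```

Reporting the median alongside the mean is a common choice for SEQ, since the 7-point ratings are ordinal and often skewed toward the easy end.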