Experimentation Case Study

When 95% Completion Didn't Move the Needle: Diagnosing Experiment Flaws

A product tour experiment for Jira Software, a project management tool, increased signups by 10%—but only 2.4% of visitors completed the tour. I designed a new interactive board experience to improve completion. We achieved 95% completion, yet saw no signup impact. Through systematic diagnosis, we identified four critical methodology flaws that would inform our next experiment and drive a 66% increase in conversions.

Experiment Lead

A/B Testing

B2B SaaS

Experimentation Methodology

Interactive Components

Product

JSW Product Tour

What I Did

Design, Testing & Diagnosis

Role

Product Designer

Timeline

6 Weeks


The Context

The Paradox

The previous interactive timeline experiment showed promising results with a 10% signup increase. Leadership celebrated the win. However, deeper analysis revealed a critical issue: only 2.4% of 31,072 visitors completed the entire modal experience.

The data showed a massive 95.5% drop-off between step one (updating an epic) and step two (adding a story). We were converting users without educating them about the product's value—potentially impacting long-term activation and retention.

"Overall positive results for interactive demo with high engagement towards the first action of the demo and 2.4% of visitors finishing entire modal experience."

— Previous Experiment Results

Previous Experiment Data

Total Visitors: 31,072

Step 1 (Update Epic): 30,394 (97.8%)

Step 2 (Add Story): 1,379 (4.4%)

Completed Modal: 740 (2.4%)

Drop-off Rate (Step 1 → Step 2): 95.5%

Our Hypothesis

We were converting users without educating them about the product's value—potentially impacting activation and retention. An interactive board with step-by-step guidance would improve completion rates and lead to more qualified signups.

Goal: Design an interactive board experience that increases completion from 2.4% to 90%+ while driving qualified signups.
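For reference, the percentages in the Previous Experiment Data above follow directly from the raw counts. A minimal sketch of that arithmetic (illustrative only, not the team's actual analysis pipeline):

```ts
// Raw funnel counts from the previous experiment (from the data above).
const totalVisitors = 31_072;
const step1UpdateEpic = 30_394;
const step2AddStory = 1_379;
const completedModal = 740;

const pct = (part: number, whole: number) => (100 * part) / whole;

console.log(pct(step1UpdateEpic, totalVisitors).toFixed(1)); // ~97.8% of visitors reached step 1
console.log(pct(step2AddStory, totalVisitors).toFixed(1));   // ~4.4% reached step 2
console.log(pct(completedModal, totalVisitors).toFixed(1));  // ~2.4% completed the modal

// Drop-off between step 1 and step 2: the 95.5% figure quoted in the case study.
console.log((100 - pct(step2AddStory, step1UpdateEpic)).toFixed(1)); // ~95.5
```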

Digging Deeper

Design Process

01

Discover

Analyzed previous experiment data, identified 95.5% drop-off between updating epic and adding story

02

Research

Analyzed SaaS onboarding patterns: guided tours, progressive disclosure, interactive demos

03

Design

Created interactive board with drag-and-drop cards, editable fields, step-by-step guidance (1/7-7/7)

04

Test

Shipped to production, monitored 95% completion rate and signup impact

What I Designed & Why

Step-by-step guidance

Visual markers (1/7, 2/7) + action-driven copy guide users through each step, reducing confusion (a step model is sketched after this section)

[Embedded interactive board demo: "Create" and "Restart demo" controls above TO DO, IN PROGRESS, and DONE columns, populated with example cards such as "Design AI shopping suggestions for homepage" and "Improve payment checkout time on mobile"]

Interactive components

Drag-and-drop cards and editable fields extend Atlassian's design system with interactive patterns

Realistic but simplified

Authentic Jira UI patterns with pre-filled content to reduce cognitive load and speed up completion

These design decisions aimed to increase completion from 2.4% to 90%+ while maintaining design system consistency.
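As an aside on the step-by-step guidance above, here is a minimal sketch of how the seven-step flow could be modeled. The field names, targets, and copy are hypothetical, not the shipped implementation:

```ts
// Hypothetical model for the guided demo: seven steps, each with a marker (1/7 ... 7/7)
// and a piece of action-driven copy anchored to a board element.
interface DemoStep {
  marker: string;      // e.g. "1/7"
  target: string;      // board element the step points at (hypothetical ids)
  instruction: string; // action-driven microcopy
}

const steps: DemoStep[] = [
  { marker: "1/7", target: "epic-field", instruction: "Update the epic name" },
  { marker: "2/7", target: "create-button", instruction: "Add a story to the board" },
  // ...remaining steps follow the same shape, ending with 7/7.
];

// The demo counts as completed when the final step's action fires.
const isComplete = (stepIndex: number): boolean => stepIndex === steps.length - 1;
```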

Cross-Functional Collaboration

Led design direction while partnering with PM, Engineering, Content Design, and Data Science to diagnose methodology flaws and ensure rigorous experiment setup.

Product Manager

My Role: Presented data analysis showing 95% drop-off, proposed interactive board solution

Their Role: Defined success metrics, prioritized goals in alignment workshop, secured stakeholder buy-in

Engineering

My Role: Created detailed interaction specs, design system documentation, collaborated on technical constraints

Their Role: Built interactive components, implemented A/B test infrastructure, set up exposure tracking

Content Design

My Role: Designed step markers, microcopy placement, and information architecture

Their Role: Wrote action-driven copy for each step, ensured clarity and brand voice consistency

Data Science

My Role: Requested specific analytics on drop-off points and completion funnels

Their Role: Analyzed previous experiment data, calculated statistical significance, validated methodology

This cross-functional approach ensured the experiment had both strong UX design and sound methodology—critical for generating actionable insights.

Testing Approach

The Experiment

Launched an A/B test targeting all desktop users visiting the English product tour page. The interactive board component replaced static "Plan" tab content in the treatment group (50% of traffic), with the goal of increasing modal completion from 2.4% to 90%+ and driving more signups.
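For intuition on the 50/50 split, assignment is typically a deterministic hash of an anonymous visitor id, so a returning visitor always sees the same variant. This is a sketch under that assumption; in production, the team's experimentation platform handled bucketing and exposure logging:

```ts
import { createHash } from "crypto";

// Illustrative deterministic 50/50 assignment. The experiment key and visitor id
// format are hypothetical; production bucketing was owned by the experimentation platform.
type Variant = "control" | "treatment";

function assignVariant(visitorId: string, experimentKey = "jsw-tour-interactive-board"): Variant {
  const hash = createHash("sha256").update(`${experimentKey}:${visitorId}`).digest();
  // The first byte of the hash is uniformly distributed, so < 128 gives a 50/50 split.
  return hash[0] < 128 ? "control" : "treatment";
}

// Example: assignVariant("anon-visitor-123") returns "control" or "treatment" consistently.
```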

Control Group (50%)

Standard product tour with static content

Product tour tabs (Plan, Track, Release...) with static screenshots

Text descriptions explaining board features

No hands-on interaction or step guidance

Treatment Group (50%)

Interactive board with step-by-step guidance

Drag-and-drop cards between TO DO, IN PROGRESS, DONE columns

Editable fields with pre-filled example epics

Step markers (1/7, 2/7...) with action-driven microcopy

Live Testing

Shipped directly to production as an A/B experiment. Partnered with Engineering to productionize the interactive components, then monitored engagement metrics—clicks, completions, and drop-off rates—to validate the design and surface insights.
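A simplified sketch of the kind of engagement instrumentation this implies; the event names and logger are hypothetical stand-ins for the real analytics pipeline:

```ts
// Hypothetical analytics events for the interactive board demo. The real event names
// and transport belong to the production analytics stack; this only shows the shape.
type DemoEvent =
  | { name: "demo_step_completed"; step: number; totalSteps: number }
  | { name: "demo_completed" }
  | { name: "demo_abandoned"; lastStep: number };

function track(event: DemoEvent): void {
  // In production this would send to the analytics backend; here it just logs.
  console.log("analytics", JSON.stringify(event));
}

// Example usage while a visitor works through the guided board:
track({ name: "demo_step_completed", step: 1, totalSteps: 7 });
track({ name: "demo_completed" });
```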

The components built for this experiment were designed to be production-ready. While this experiment revealed methodology flaws, the components were later reused in the Google Ads redesign, which drove a 66% signup increase.

Outcome

Results

UX Success

95%

95% completion rate among users who saw the board (up from 2.4%). The design worked—the problem was visibility and targeting.

Business Metrics

+0.6%

+0.6% signup increase (not statistically significant). The design worked—95% completed it. The experiment methodology failed.
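For context on what "not statistically significant" means here: the standard check is a two-proportion z-test on signup rates in control versus treatment. The counts in this sketch are placeholders, not the experiment's actual numbers:

```ts
// Two-proportion z-test: is the observed lift larger than random noise would produce?
function twoProportionZ(convControl: number, nControl: number, convTreat: number, nTreat: number): number {
  const pC = convControl / nControl;
  const pT = convTreat / nTreat;
  const pPooled = (convControl + convTreat) / (nControl + nTreat);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nControl + 1 / nTreat));
  return (pT - pC) / se;
}

// Hypothetical example: a ~0.6% relative lift on a low baseline with ~15k visitors per arm.
// |z| comes out far below 1.96, i.e. indistinguishable from noise at the usual 95% level.
const z = twoProportionZ(600, 15_000, 604, 15_000);
console.log(z.toFixed(2));
```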

Four Critical Flaws in Our Experiment Design

01

Wrong Audience

30% of participants were logged-in users who already had Jira Software—diluting results.

Fix: Target only new-to-JSW users for accurate signup impact measurement

02

Below-the-Fold Placement

Only 37% of users scrolled far enough to see the interactive board—limiting impact.

Fix: Position key interactions above the fold or track exposure on scroll

03

Exposure Tracking Error

Fired on page load, not when users actually saw the demo—counting people who never engaged.

Fix: Trigger exposure only when the treatment is visible in the viewport (see the sketch after this list)

04

Platform Bug

Board styling flashed between dark/light mode due to a platform update—disrupting the user experience.

Fix: Rigorous QA and stable technical environment before launch
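A minimal sketch of the fix for flaw 03, using an IntersectionObserver so the exposure event fires only once the treatment is actually in the viewport. The element id and logging call are illustrative, not the production code:

```ts
// Fire the experiment exposure event when the interactive board becomes visible,
// instead of on page load. The exposure logger itself is a placeholder.
function trackExposureWhenVisible(element: Element, logExposure: () => void): void {
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        // Require at least half of the board to be on screen before counting exposure.
        if (entry.isIntersecting && entry.intersectionRatio >= 0.5) {
          logExposure();         // count this visitor in the experiment analysis
          observer.disconnect(); // fire once per page view
        }
      }
    },
    { threshold: 0.5 }
  );
  observer.observe(element);
}

// Example usage (element id is hypothetical):
const board = document.querySelector("#interactive-board-demo");
if (board) {
  trackExposureWhenVisible(board, () => console.log("exposure logged"));
}
```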

What This Taught Me

01

Completion ≠ Conversion

Users engaging with features doesn't automatically drive business metrics. Both matter, but they measure different aspects of success.

02

Methodology = Design Quality

Even well-designed experiences need proper targeting, exposure tracking, and technical stability to prove business impact.

03

Question "Winning" Metrics

The previous experiment's 10% signup increase masked a 95.5% drop-off rate. Always dig deeper than primary metrics.

04

Failed Experiments = Learning

This established best practices for targeting, exposure tracking, and methodology that drove a 66% win in the next experiment.

What Happened Next

Applied these four fixes immediately to Google Ads landing pages, turning methodology learnings into measurable results.

+66%

Completed Signups

+69.7%

Account Creation Starts

Measured in Statsig (All Metrics)

This proved that experiment methodology is as critical as design quality for driving business results.
