Experimentation Case Study

When 95% Completion Didn't Move the Needle: Diagnosing Experiment Flaws

A product tour experiment for Jira Software, a project management tool, increased signups by 10%—but 95% of users abandoned the demo. I designed a new interactive board experience and achieved 95% completion, yet saw no signup impact. Through systematic diagnosis, we identified four critical methodology flaws that would inform our next experiment and drive a 66% increase in conversions.

Experiment Lead

A/B Testing

Interactive Components

B2B SaaS

Project Management Software

Product: JSW Product Tour

What I Did: Design & Research

Role: Product Designer

Timeline: 6 Weeks



Context

The Paradox

The previous interactive timeline experiment showed promising results with a 10% signup increase. Leadership celebrated the win. However, deeper analysis revealed a critical issue: only 2.4% of 31,072 visitors completed the entire modal experience.

The data showed a massive 95.5% drop-off between the first interaction (updating an epic) and the second step (adding a story). This meant we were converting users without educating them about the product's value—potentially impacting long-term activation and retention.

"Overall positive results for interactive demo with high engagement towards the first action of the demo and 2.4% of visitors finishing entire modal experience."

— Previous Experiment Results

Previous Experiment Data

Total Visitors: 31,072
Step 1 (Update Epic): 30,394 (97.8%)
Step 2 (Add Story): 1,379 (4.4%)
Completed Modal: 740 (2.4%)
Drop-off Rate (Step 1 → Step 2): 95.5%
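The percentages above are straightforward funnel arithmetic: each step as a share of total visitors, plus the step-to-step loss between "Update Epic" and "Add Story". A small TypeScript sketch reproducing those numbers (the counts are copied from the table; the helper names are mine):

```ts
// Funnel arithmetic behind the table above; counts copied from the experiment data.
type FunnelStep = { label: string; users: number };

const totalVisitors = 31_072;
const steps: FunnelStep[] = [
  { label: "Step 1: Update Epic", users: 30_394 },
  { label: "Step 2: Add Story", users: 1_379 },
  { label: "Completed Modal", users: 740 },
];

const pct = (part: number, whole: number) => `${((part / whole) * 100).toFixed(1)}%`;

// Share of all visitors reaching each step: 97.8%, 4.4%, 2.4%
for (const step of steps) {
  console.log(`${step.label}: ${step.users.toLocaleString()} (${pct(step.users, totalVisitors)})`);
}

// Loss between the first interaction and the second: 1 - 1,379 / 30,394 ≈ 95.5%
const dropOff = 1 - steps[1].users / steps[0].users;
console.log(`Drop-off between steps 1 and 2: ${(dropOff * 100).toFixed(1)}%`);
```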

Our Hypothesis

We were converting users without educating them about the product's value—potentially impacting activation and retention. An interactive board component with better guidance would improve completion rates and lead to more qualified signups.

Goal: Design an interactive board experience that reduces drop-off and increases modal completion rates, ultimately driving more product signups.

Digging Deeper

Design Process

01

Discover

Analyzed previous experiment data, identified 95.5% drop-off between steps 1-2

02

Research

Analyzed SaaS onboarding patterns: guided tours, progressive disclosure, interactive demos

03

Design

Created interactive board with drag-and-drop, editable fields, step guidance

04

Test

Shipped to production, monitored completion and signup metrics

What I Designed & Why

Step-by-step guidance

Visual markers + action-driven copy to reduce confusion about what to do next (see the sketch below)

[Interactive board demo: TO DO, IN PROGRESS, and DONE columns with pre-filled sample cards (e.g. "Design AI shopping suggestions for homepage", "Improve payment checkout time on mobile"), plus Create and Restart demo controls.]

Interactive components

Drag-and-drop cards, editable fields—extended Atlassian design system

Realistic but simplified

Real Jira patterns + pre-filled content to reduce cognitive load
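To make the guided flow concrete, here is a minimal sketch of how the step guidance could be wired up: an ordered list of actions, a pointer to the active step, and a completion callback once the final step is done. The names (DemoStep, GuidedDemo, the analytics call) are illustrative assumptions, not the shipped Atlassian implementation.

```ts
// Hypothetical step-guidance controller for the interactive board demo.
interface DemoStep {
  id: string;
  marker: string; // action-driven microcopy shown at the step marker
}

class GuidedDemo {
  private current = 0;

  constructor(
    private steps: DemoStep[],
    private onComplete: () => void, // e.g. log a "demo completed" analytics event
  ) {}

  get activeStep(): DemoStep | undefined {
    return this.steps[this.current];
  }

  // Called by the board UI when the user finishes the active step
  // (drags a card, edits a field, and so on).
  completeActiveStep(): void {
    if (this.current >= this.steps.length) return;
    this.current += 1;
    if (this.current === this.steps.length) this.onComplete();
  }
}

// Usage: the demo advances one marker at a time and reports completion once.
const demo = new GuidedDemo(
  [
    { id: "update-epic", marker: "Update the epic" },
    { id: "add-story", marker: "Add a story" },
    { id: "move-card", marker: "Drag a card to Done" },
  ],
  () => console.log("Modal experience completed"),
);
demo.completeActiveStep(); // step 1 done, marker advances to "Add a story"
```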

Cross-Functional Collaboration

Led design direction while partnering closely with multiple teams to ensure technical feasibility, accurate messaging, and rigorous experiment methodology.

Product Manager

My Role: Presented data analysis showing 95% drop-off, proposed interactive board solution

Their Role: Defined success metrics, prioritized goals in alignment workshop, secured stakeholder buy-in

Engineering

My Role: Created detailed interaction specs, design system documentation, collaborated on technical constraints

Their Role: Built interactive components, implemented A/B test infrastructure, set up exposure tracking

Content Design

My Role: Designed step marker UI, microcopy placement, and content hierarchy strategy

Their Role: Wrote action-driven copy for each step, ensured clarity and brand voice consistency

Data Science

My Role: Requested specific analytics on drop-off points and completion funnels

Their Role: Analyzed previous experiment data, calculated statistical significance, validated methodology

Testing Approach

The Experiment

Launched an A/B test targeting all desktop users visiting the English product tour page. The interactive board component replaced static "Plan" tab content in the treatment group, with the goal of increasing modal completion rates and driving more signups. A sketch of the eligibility check and 50/50 assignment follows the two variant descriptions below.

Control Group (50%)

Standard product tour with static images and text describing the board view.

Product tour tabs with static screenshots

Text descriptions of features

No interactive elements

Treatment Group (50%)

Enhanced experience with interactive board, step guidance, and hands-on exploration.

Drag-and-drop cards between columns

Editable fields and pre-filled epics

Step markers with action-driven microcopy
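Under the hood, a setup like this needs two pieces: an eligibility check (desktop, English product tour page) and a deterministic 50/50 assignment so a returning visitor always lands in the same variant. Here is a rough sketch under those assumptions; the experiment key, hashing scheme, and eligibility fields are illustrative, not Atlassian's actual experimentation infrastructure:

```ts
// Illustrative eligibility + 50/50 assignment for the product-tour experiment.
import { createHash } from "node:crypto";

type Variant = "control" | "treatment";

interface Visitor {
  id: string;        // anonymous visitor identifier
  isDesktop: boolean;
  locale: string;    // e.g. "en-US"
}

// The experiment targeted desktop visitors on the English product tour page.
const isEligible = (v: Visitor): boolean => v.isDesktop && v.locale.startsWith("en");

function assignVariant(v: Visitor, experimentKey = "jsw-tour-interactive-board"): Variant | null {
  if (!isEligible(v)) return null; // not enrolled; sees the default page
  // Deterministic split: hash the visitor id with the experiment key and take one byte.
  const firstByte = createHash("sha256").update(`${experimentKey}:${v.id}`).digest()[0];
  return firstByte % 2 === 0 ? "control" : "treatment";
}

// A desktop visitor on the English page is bucketed; a mobile visitor is not.
console.log(assignVariant({ id: "visitor-42", isDesktop: true, locale: "en-GB" }));
console.log(assignVariant({ id: "visitor-42", isDesktop: false, locale: "en-GB" })); // null
```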

Live Testing

Shipped directly to production as an A/B experiment. Worked closely with Engineering to productionize the interactive components, then monitored engagement data including clicks, completions, and drop-off rates to validate the design and surface insights.

The components built for this experiment were designed to be production-ready, allowing us to quickly roll out successful patterns to other pages if results were positive.

Outcome

Results

UX Success

95%

Of users who saw the interactive board completed all steps—up from 2.4%. This proved the UX design worked as intended.

Business Metrics

+0.6%

No statistically significant increase in signups (±3.1%). But this wasn't a UX failure—it was an experiment design failure.
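For intuition on why +0.6% inside a ±3.1% interval reads as noise, here is a rough two-proportion z-test. The per-arm visitor and signup counts below are placeholders (they are not reported in this write-up); only the shape of the calculation matters:

```ts
// Rough two-proportion z-test: is the treatment signup rate distinguishable from control?
// The counts are illustrative placeholders, not the experiment's real per-arm numbers.
function twoProportionZ(convControl: number, nControl: number, convTreat: number, nTreat: number) {
  const pControl = convControl / nControl;
  const pTreat = convTreat / nTreat;
  const pooled = (convControl + convTreat) / (nControl + nTreat);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nControl + 1 / nTreat));
  return { absoluteLift: pTreat - pControl, z: (pTreat - pControl) / se };
}

// Placeholder arms: ~15,500 visitors each, ~4% baseline signup rate, +0.6% relative lift.
const { absoluteLift, z } = twoProportionZ(620, 15_500, 624, 15_500);
console.log(`lift ${(absoluteLift * 100).toFixed(2)} pp, z = ${z.toFixed(2)}`);
// |z| is far below 1.96, so the difference is not significant at 95% confidence,
// which is what "+0.6% (±3.1%)" summarizes above.
```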

Four Methodology Flaws That Invalidated Our Results

01

Wrong Audience

30% of participants were logged-in users who already had Jira Software—diluting results.

Fix: Target only new-to-JSW users for accurate signup impact measurement

02

Below-the-Fold Placement

Only 37% of users scrolled far enough to see the interactive board—limiting impact.

Fix: Position key interactions above the fold or track exposure on scroll

03

Exposure Tracking Error

Fired on page load, not when users actually saw the demo—counting people who never engaged.

Fix: Trigger exposure only when the treatment is visible in the viewport (see the sketch after this list)

04

Platform Bug

Board styling flashed between dark and light mode due to a platform change—impacting the user experience.

Fix: Rigorous QA and stable technical environment before launch
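Flaws 02 and 03 share one practical remedy: log the exposure event only when the treatment component actually enters the viewport, instead of on page load. Below is a minimal browser-side sketch using IntersectionObserver; the logExposure function and experiment key are assumed names for whatever analytics call the experiment platform provides:

```ts
// Fire exposure only when the treatment is actually seen, not on page load.
// logExposure and the experiment key are illustrative, not a specific platform API.
declare function logExposure(experimentKey: string): void;

function trackExposureOnView(el: Element, experimentKey: string): void {
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        // Require at least half of the component to be visible before counting exposure.
        if (entry.isIntersecting && entry.intersectionRatio >= 0.5) {
          logExposure(experimentKey);
          observer.disconnect(); // count each visitor at most once
        }
      }
    },
    { threshold: 0.5 },
  );
  observer.observe(el);
}

// Usage: attach to the interactive board container rendered for the treatment group.
const board = document.querySelector("#interactive-board-demo");
if (board) trackExposureOnView(board, "jsw-tour-interactive-board");
```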

What This Taught Me

01

Completion ≠ Conversion

Users engaging with features doesn't automatically drive business metrics. Both matter, but they measure different aspects of success.

02

Methodology = Design Quality

Even well-designed experiences need proper targeting, exposure tracking, and technical stability to prove business impact.

03

Question "Winning" Metrics

A 10% signup increase masked 95% of users abandoning the experience. Always dig deeper than surface metrics.

04

Failed Experiments = Learning

This established best practices for targeting, exposure tracking, and methodology that drove a 66% win in the next experiment.

What Happened Next

These weren't theoretical recommendations—I immediately applied all four fixes to redesign Google Ads landing pages for paid traffic.

+66% Signup Increase

+69.7% Account Creations

Statsig: All Metrics

This proved that experiment methodology is as critical as design quality for driving business results.
