A/B Testing Case Study

From 10% Win to Flat Results: Diagnosing Interactive Demo Failure

Designed an interactive board component for the JSW product tour to address a 95% drop-off rate from a previous experiment. Led the design process, including competitive analysis, cross-functional collaboration, and component design that extended Atlassian's design system.

Product Design · A/B Testing · Interactive Components

Product: JSW Product Tour
What I Did: Design & Research
Role: Product Designer
Timeline: 6 Weeks

The Context

The Challenge

The previous interactive timeline experiment showed promising results, with a roughly 10% signup uplift. However, deeper analysis revealed a critical issue: only 2.4% of 31,072 visitors completed the entire modal experience.

The data showed a massive 95.5% drop-off between the first interaction (updating an epic) and the second step (adding a story). This suggested the interactive demo wasn't effectively guiding users through the full product value proposition.

"Overall positive results for interactive

demo with high engagement towards the

first action of the demo and 2.4% of

visitors finishing entire modal

experience."

— Previous Experiment Results

Previous Experiment Data

Total Visitors: 31,072
Step 1 (Update Epic): 30,394 (97.8%)
Step 2 (Add Story): 1,379 (4.4%)
Completed Modal: 740 (2.4%)
Drop-off Rate: 95.5%
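For reference, the drop-off figures above can be reproduced directly from the raw step counts. A minimal TypeScript sketch using the counts from the table; the `FunnelStep` shape and the logging are illustrative, not part of any real analysis pipeline:

```typescript
// Reproduces the funnel figures above from raw step counts.
// Counts come straight from the table; `FunnelStep` is an illustrative shape.
interface FunnelStep {
  label: string;
  visitors: number;
}

const funnel: FunnelStep[] = [
  { label: "Total Visitors", visitors: 31_072 },
  { label: "Step 1: Update Epic", visitors: 30_394 },
  { label: "Step 2: Add Story", visitors: 1_379 },
  { label: "Completed Modal", visitors: 740 },
];

const total = funnel[0].visitors;
for (let i = 1; i < funnel.length; i++) {
  const share = (100 * funnel[i].visitors) / total; // share of all visitors
  const dropOff =
    100 - (100 * funnel[i].visitors) / funnel[i - 1].visitors; // loss vs. previous step
  console.log(
    `${funnel[i].label}: ${share.toFixed(1)}% of visitors, ` +
      `${dropOff.toFixed(1)}% drop-off from previous step`
  );
}
// e.g. "Step 2: Add Story: 4.4% of visitors, 95.5% drop-off from previous step"
```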

Problem Statement

The previous interactive demo achieved initial engagement but failed to guide users through the complete experience. With 95% of users dropping off after the first interaction, we weren't effectively demonstrating JSW's full value proposition. This represented a significant missed opportunity to convert engaged visitors into signups.

Goal: Design an interactive board experience that reduces drop-off and increases modal completion rates, ultimately driving more product signups.

Digging Deeper

Design Process

Research & Analysis

Data Analysis

Analyzed the winning timeline experiment data, identifying the critical 95% drop-off point between first and second interactions. This revealed users needed clearer guidance and more compelling reasons to continue through the modal.

Reviewed interaction patterns across all 8 modal steps
Identified engagement drop-off points
Studied completion rates and user behavior patterns

Competitive Teardown

Analyzed how competitors like Dooly displayed interactive product tours. Studied their use of step markers, guided copy, and progressive disclosure to keep users engaged through multi-step experiences.

[Dooly Interactive Tour Screenshot]
Example: Dooly's step-by-step guided tour approach

Product Study

Spent time in the actual Jira product understanding authentic board interactions—how users create epics, add stories, update fields, and manage timelines. This ensured our demo would feel realistic rather than oversimplified.

Collaboration & Scoping

Ran a prioritization workshop to align the team, where PM, Engineering, and Marketing ranked goals—enterprise credibility, simplicity, and onboarding. We scoped a lean MVP: a board-and-issue demo with pre-filled epics, editable fields, and guided microcopy.

Project Constraints

Six-week timeline
Fast turnaround required leveraging existing design system components rather than custom designs.

Experiment-first approach
Built as a testable MVP rather than a full product feature to validate the approach before larger investment.

Design system extension
Used the Atlassian design system for consistency and extended it with new interactive patterns.

Wireframes

Created grayscale layouts in Figma testing different board structures and interaction flows. Feedback from the team pushed us toward a cleaner, more streamlined version that focused on the core board interactions.

[Wireframe: Roadmap board layout exploration]
Early wireframes exploring board layouts, card structures, and epic organization

High-fidelity Design

Built on Atlassian's design system for consistency, but extended it with new interactive components: drag-and-drop cards, editable fields, and step markers with action-driven copy. Collaborated closely with Engineering and Content Design to refine interaction details and ensure technical feasibility.

[Hi-fi: Interactive board with step guidance]
Final design showing interactive board with guided step-by-step instructions

Key Design Decisions

Step-by-step guidance
Added clear visual step markers and action-driven microcopy at each stage to guide users through the experience and reduce confusion about what to do next (see the sketch after this list).

Interactive components
Designed drag-and-drop functionality, editable text fields, and clickable elements that extended Atlassian's design system while maintaining brand consistency.

Realistic but simplified
Balanced authenticity with simplicity—used real Jira patterns but pre-filled content to reduce cognitive load and keep focus on key interactions.
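A minimal sketch of the step-guidance logic described above, assuming a linear sequence of demo steps with action-driven microcopy. The step IDs, prompts, and the `StepGuide` class are hypothetical illustrations, not the shipped component:

```typescript
// Hypothetical linear step guide for the demo; not the shipped component.
interface DemoStep {
  id: string;
  microcopy: string; // action-driven prompt shown at this step
}

const steps: DemoStep[] = [
  { id: "update-epic", microcopy: "Drag the epic to In Progress" },
  { id: "add-story", microcopy: "Click + to add a story under this epic" },
  // ...remaining modal steps
];

class StepGuide {
  private index = 0;

  get current(): DemoStep {
    return steps[this.index];
  }

  // Advance only when the user completes the expected action, so the
  // step marker and microcopy always match what to do next.
  complete(actionId: string): void {
    if (actionId === this.current.id && this.index < steps.length - 1) {
      this.index++;
    }
  }
}
```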

Digging Deeper

The Experiment

Launched an A/B test targeting all desktop users visiting the English product tour page. The interactive board component replaced static "Plan" tab content in the treatment group, with the goal of increasing modal completion rates and driving more signups.

Control Group (50%): Static Plan Tab Content
Standard product tour with static images and text describing the board view.
Product tour tabs with static screenshots
Text descriptions of features
No interactive elements

Treatment Group (50%): Interactive Board Component
Enhanced experience with interactive board, step guidance, and hands-on exploration.
Drag-and-drop cards between columns
Editable fields and pre-filled epics
Step markers with action-driven microcopy
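A 50/50 split like the one above is commonly implemented with deterministic bucketing, so a visitor stays in the same arm across page loads. A generic sketch assuming a stable visitor ID; this is not Atlassian's experimentation stack, and `assignVariant` is an illustrative name:

```typescript
// Generic deterministic 50/50 bucketing sketch; not Atlassian's actual stack.
// A stable hash of (experimentKey, visitorId) keeps a visitor in one arm
// across page loads.
function hash(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return h;
}

type Variant = "control" | "treatment";

function assignVariant(experimentKey: string, visitorId: string): Variant {
  return hash(`${experimentKey}:${visitorId}`) % 2 === 0
    ? "control" // static Plan tab content
    : "treatment"; // interactive board component
}
```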

Live Testing

Shipped directly to production as an A/B experiment. Worked closely with Engineering to productionize the interactive components, then monitored engagement data including clicks, completions, and drop-off rates to validate the design and surface insights.

The components built for this experiment were designed to be production-ready, allowing us to quickly roll out successful patterns to other pages if results were positive.

Outcome

Results

Experiment Summary

Pre-analysis set the success criterion at a 5% uplift in signup rate, with an estimated 7-day runtime. After 14 days, metrics still hadn't reached statistical significance. However, the experiment revealed critical methodology issues more valuable than a simple pass/fail result.

Target Uplift: 5% Signup Rate
Estimated Runtime: 7 Days
Actual Runtime: 14 Days
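Runtime estimates like the 7-day figure above typically come from a standard sample-size calculation. A generic sketch of that arithmetic using the two-proportion approximation; the baseline signup rate and daily traffic below are placeholder assumptions, not the experiment's actual values:

```typescript
// Generic runtime estimate for a two-arm test. The baseline rate and daily
// traffic are placeholder assumptions, not the experiment's real inputs.
function requiredPerArm(
  baseline: number, // baseline signup rate, e.g. 0.04
  relativeUplift: number // minimum detectable uplift, e.g. 0.05 for 5%
): number {
  const delta = baseline * relativeUplift; // absolute difference to detect
  const zAlpha = 1.96; // two-sided alpha = 0.05
  const zBeta = 0.84; // power = 0.80
  const pBar = baseline + delta / 2; // average rate across arms
  return Math.ceil(
    (2 * (zAlpha + zBeta) ** 2 * pBar * (1 - pBar)) / delta ** 2
  );
}

const perArm = requiredPerArm(0.04, 0.05); // ~154,000 with these inputs
const dailyVisitorsPerArm = 10_000; // assumed traffic per arm
console.log(`~${Math.ceil(perArm / dailyVisitorsPerArm)} days needed`);
```

With a low baseline rate, a 5% relative uplift needs a large sample, which is consistent with this test remaining inconclusive even after the runtime was doubled.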

Scorecard Metrics
Experiment results for primary and secondary metrics

JSW D0DAI (engagement): -0.2% (±3.5%)
JSW Signups: +0.6% (±3.1%)
Total Exposures: 278K
Confidence Level: 95%

Conclusion

The interactive board demo didn't produce the expected 5% uplift in signup rates. After running twice the estimated time, results showed minimal impact (+0.6% ±3.1% signups, -0.2% ±3.5% engagement). None of the changes reached statistical significance, but the experiment surfaced critical issues with targeting and visibility that informed future testing methodology.
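The scorecard's ± figures are 95% confidence margins, so a change is statistically significant only if its interval excludes zero. A trivial check with the reported numbers:

```typescript
// A 95% CI of effect ± margin excludes zero only if |effect| > margin.
function isSignificant(effectPct: number, marginPct: number): boolean {
  return Math.abs(effectPct) > marginPct;
}

console.log(isSignificant(0.6, 3.1)); // JSW Signups: +0.6% ±3.1% -> false
console.log(isSignificant(-0.2, 3.5)); // JSW D0DAI:  -0.2% ±3.5% -> false
```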

Three critical issues discovered

Target Audience

About 30% of participants were existing, logged-in users—outside our intended audience of new prospects—which diluted results.

Challenge Identified
Future tests should target only new-to-new users to ensure relevance and clearer impact on signup metrics.

Experiment Visibility

The board demo sat below the fold, so only 37% of users scrolled far enough to see it—limiting potential impact.

Challenge Identified
Position interactive elements above the fold, or fire exposure events only when users scroll to the treatment.
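One standard way to implement the scroll-based variant of this fix is an IntersectionObserver that fires exposure only when the treatment is actually visible. A generic browser-side sketch; `trackExposure` is a placeholder for whatever analytics call the team used:

```typescript
// Fire the exposure event only once the treatment is actually visible.
// `trackExposure` is a placeholder for the team's real analytics call.
declare function trackExposure(experimentKey: string): void;

function observeExposure(el: Element, experimentKey: string): void {
  const observer = new IntersectionObserver(
    (entries) => {
      if (entries.some((e) => e.isIntersecting)) {
        trackExposure(experimentKey); // count the user as exposed
        observer.disconnect(); // fire at most once
      }
    },
    { threshold: 0.5 } // at least half the demo in the viewport
  );
  observer.observe(el);
}
```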

Data Dilution

Exposure events fired on page load—counting users who never engaged with the demo and skewing metrics downward.

Challenge Identified
Trigger exposure only after users interact with the demo to capture true engagement and treatment effect.
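A companion sketch for this recommendation: count exposure only at the user's first interaction with the demo, so untouched page loads no longer dilute treatment metrics. Again, `trackExposure` stands in for the real analytics call:

```typescript
// Count exposure only when the user first interacts with the demo.
declare function trackExposure(experimentKey: string): void;

function exposeOnFirstInteraction(
  demoRoot: Element,
  experimentKey: string
): void {
  let fired = false;
  const fire = () => {
    if (fired) return; // guard against firing for both event types
    fired = true;
    trackExposure(experimentKey);
  };
  // Either a click/drag or a key press counts as the first interaction;
  // `once: true` removes each listener after its first event.
  demoRoot.addEventListener("pointerdown", fire, { once: true });
  demoRoot.addEventListener("keydown", fire, { once: true });
}
```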

What We Learned

The demo didn't directly increase signups, but validated that exposing value early improves engagement when properly targeted.

Showed where users dropped off, shaping hypotheses for future onboarding improvements and paid product tour iterations.

Established a rigorous framework for experiment design: precise audience targeting, viewport/scroll considerations, and proper exposure event timing.

Reflections

Key Learnings

Design Process Skills

01 Data-driven design decisions
Used previous experiment data to identify the 95% drop-off problem and inform design direction

02 Competitive analysis
Studied how competitors solve similar problems to inform interaction patterns and guidance approaches

03 Cross-functional collaboration
Led alignment workshops and worked closely with Engineering and Content to refine designs for production

04 Design system extension
Extended Atlassian's design system with new interactive patterns while maintaining brand consistency

Long-term Impact

01 Established best practices
Created guidelines for below-the-fold experiments adopted by the growth team

02 Improved targeting precision
Influenced how the team defined experiment populations for clearer results

03 Better exposure tracking
Changed how the team fired exposure events to capture true engagement

04 Reusable components
Built production-ready interactive components that could be deployed to other pages
