When a Demo Fails (And Why That’s Valuable)
We tested an interactive Jira Software demo to improve signups. While it didn’t increase conversions, it revealed valuable insights that shaped future onboarding experiments.
The Why
New users weren’t seeing Jira’s value before signup—static tours buried the story, driving high drop-offs.
A past interactive test had boosted signups by ~10%. Building on that, the challenge became: could we design a hands-on demo that let users try Jira before committing?
Project Constraints
While the idea was ambitious, the project had to run as a time-boxed growth experiment. That meant keeping scope lean, working within Atlassian's existing design system, and coordinating closely with PM and Engineering to move quickly without sacrificing quality.
High ad spend
Traffic volume was critical; small percentage changes mattered (see the sample-size sketch below).
Hero section only
Constrained to above-the-fold experience.
System reuse
Had to stay within Atlassian's existing components.
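To make the traffic constraint concrete: with conversion rates in the low single digits, even a modest relative lift requires very large samples per arm. Below is a rough, self-contained sample-size sketch for a two-proportion test; the baseline rate and minimum detectable effect are illustrative placeholders, not the experiment's actual numbers.

```ts
// Rough per-variant sample size for a two-proportion z-test.
// Baseline rate and minimum detectable effect (MDE) are hypothetical
// placeholders; the experiment's real figures are not disclosed here.

function sampleSizePerVariant(baseline: number, mde: number): number {
  const zAlpha = 1.96; // two-sided significance level, alpha = 0.05
  const zBeta = 0.84;  // statistical power = 0.80
  const p1 = baseline;
  const p2 = baseline + mde;
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator * numerator) / (mde * mde));
}

// Detecting a 5% relative lift on a hypothetical 4% baseline
// (0.04 -> 0.042) needs roughly 150,000 visitors per variant.
console.log(sampleSizePerVariant(0.04, 0.002)); // ~154,000
```

This is why a high-traffic, high-ad-spend page was the natural test bed: anywhere else, the experiment could not have resolved changes this small inside its time box.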
Process
This overview spans initial research through final execution and shows how I crafted the interactive demo, grounding decisions in usage data and aligning the team through collaborative workshops.
Attractiveness Analysis
To measure the success of the redesign, I used attractiveness analysis—assessing how visually appealing and engaging each element was for users. This helped determine whether the updated layout effectively captured and retained their interest.
Before the Redesign
The original hero section prioritized an embedded signup form, overshadowing the product shot. While functional, it didn't draw users into the product visuals, reducing the overall appeal of the hero section.
Engagement Rate Analysis
To gauge the redesign’s effectiveness, I analyzed engagement rates—tracking how users interacted with core elements on the page before and after the update. This metric revealed whether the new layout successfully captured user interest and encouraged deeper interaction.
Before the Redesign: While the original hero section’s embedded signup form had a high click rate, the product shot and other key information saw minimal engagement. This indicated that users were focused on signing up but missing valuable details about the product.
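As an illustration of the kind of instrumentation behind these numbers, here is a minimal sketch of per-element engagement tracking. The selectors, event shape, and /analytics endpoint are hypothetical stand-ins, not Atlassian's actual analytics pipeline.

```ts
// Minimal sketch of per-element engagement tracking for a hero section.
// Selectors and the `/analytics` endpoint are hypothetical stand-ins.

type EngagementEvent = { element: string; action: "view" | "click" };

function track(event: EngagementEvent): void {
  // In practice this would forward to a real analytics service.
  navigator.sendBeacon("/analytics", JSON.stringify(event));
}

const heroElements: Record<string, string> = {
  signupForm: "#hero-signup-form",
  productShot: "#hero-product-shot",
  valueProps: "#hero-value-props",
};

for (const [name, selector] of Object.entries(heroElements)) {
  const el = document.querySelector(selector);
  if (!el) continue;

  // One "view" per element once at least half of it enters the viewport.
  const observer = new IntersectionObserver(
    (entries) => {
      if (entries.some((e) => e.isIntersecting)) {
        track({ element: name, action: "view" });
        observer.disconnect();
      }
    },
    { threshold: 0.5 },
  );
  observer.observe(el);

  // Any click inside the element counts as engagement.
  el.addEventListener("click", () =>
    track({ element: name, action: "click" }),
  );
}

// Engagement rate per element = clicks / views, aggregated server-side.
```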
Fully Responsive
Explored layout variations to separate the signup form from the product shot and highlight value props.
Customize Everything
Built new hero concepts using Atlassian's design system, extending patterns for CTAs and visuals.
Impact
Outcome: After 14 days, signup rates remained flat, but the experiment surfaced critical insights that guided our next optimizations.
Treatment A
[Funnel metrics, values redacted: Account Creation · Viewed Email OTP Screen · Signups]
Changes:
Moved the product shot to the right and the redesigned signup form to the left.
Increased the size and prominence of the product shot.
Outcome: Despite positive interaction metrics, the overall signup rate decreased significantly compared to the control.
Treatment B
[Funnel metrics, values redacted: Account Creation · Viewed Email OTP Screen · Signups]
Changes:
Same setup as Treatment A with two additional elements:
An animated product shot to enhance engagement.
Social proof below the signup form.
Outcome: Smaller signup rate decrease (not statistically significant) compared to the control.
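The significance calls above come from comparing each treatment's signup rate against the control. A minimal two-proportion z-test sketch follows; the conversion counts are made-up placeholders, since the experiment's real rates are redacted.

```ts
// Two-proportion z-test of the kind used to call a treatment's signup
// delta "significant" or not. Inputs are hypothetical placeholders.

/** Standard normal CDF via the Abramowitz-Stegun approximation. */
function normalCdf(z: number): number {
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp((-z * z) / 2);
  const tail =
    d *
    t *
    (0.3193815 +
      t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - tail : tail;
}

/** Two-sided p-value for H0: control and treatment convert equally. */
function twoProportionPValue(
  convA: number, nA: number, // control: conversions, visitors
  convB: number, nB: number, // treatment: conversions, visitors
): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPool = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nB));
  const z = (pB - pA) / se;
  return 2 * (1 - normalCdf(Math.abs(z)));
}

// Hypothetical numbers: a small drop that fails to clear alpha = 0.05,
// mirroring Treatment B's "not statistically significant" decrease.
console.log(twoProportionPValue(4000, 100000, 3920, 100000).toFixed(3));
// ~0.359, well above 0.05
```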
What we learned
The demo didn’t directly increase signups.
It validated that exposing value early improves engagement.
It showed where users dropped off, shaping hypotheses for future onboarding and paid product tours.
Final Design
These screens highlight the opening steps of the demo—showing how users were guided into Jira’s board and issue features.
While only the first steps are shown here, the full demo included complete task flows that generated the usage data and insights we analyzed.
Reflection
Lessons Learned
Challenges:
Despite strong engagement, the demo did not reach the 5% signup goal.
Key obstacles included targeting a broad audience and the demo’s low-visibility placement.
Target Audience:
Challenge: The experiment targeted all users visiting the product tour, but 30% were already logged in and not in the target demographic.
Lesson: Narrow the target audience to new users (N2N) for future experiments.
Visibility:
Challenge: The demo was placed below the fold, with only 37% of users scrolling to view it.
Lesson: Ensure interactive elements are visible above the fold to maximize exposure.
Data Dilution:
Challenge: The exposure event fired at page load, including users who never interacted with the demo.
Lesson: Redefine exposure criteria to reflect actual interaction.
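Combining the Visibility and Data Dilution lessons, one way to tighten the exposure criterion is to fire the event only when the demo is actually on screen or touched, rather than at page load. A minimal sketch follows; the event name, endpoint, and DOM id are hypothetical.

```ts
// Sketch of a stricter exposure definition: count a user as exposed
// only once the demo is mostly visible or they interact with it,
// not at page load. Event name, endpoint, and id are hypothetical.

function fireExposure(source: "viewport" | "interaction"): void {
  navigator.sendBeacon(
    "/analytics",
    JSON.stringify({ event: "demo_exposed", source }),
  );
}

const demo = document.querySelector("#interactive-demo");
if (demo) {
  let exposed = false;
  const expose = (source: "viewport" | "interaction") => {
    if (exposed) return; // fire at most once per page view
    exposed = true;
    fireExposure(source);
  };

  // Exposed once most of the demo has scrolled into view...
  const observer = new IntersectionObserver(
    (entries) => {
      if (entries.some((e) => e.intersectionRatio >= 0.75)) {
        expose("viewport");
        observer.disconnect();
      }
    },
    { threshold: 0.75 },
  );
  observer.observe(demo);

  // ...or as soon as they touch it, whichever comes first.
  demo.addEventListener("pointerdown", () => expose("interaction"), {
    once: true,
  });
}
```

Gating exposure this way keeps never-scrolled, never-clicked visitors out of the denominator, so the experiment measures the demo's effect rather than the page's.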