Interactive Demo
This case study explores a Jira Software Product Tour experiment aimed at increasing user signups through an interactive board and issue demo. By combining a product demo with marketing, the project provided a hands-on experience to showcase Jira’s value and boost user confidence in its features.
Product
Jira Software
What I did
Product Design
My role
Design Lead
Timeline
6 weeks
Impact
Goal: A 5% uplift in signup rates within 7 days.
Outcome: After 14 days, metrics remained flat. Though we fell short of the target, the project delivered key insights for future experiments.
Context
Building on a previous win—where an interactive timeline demo boosted signups by 10%—the team hypothesized that highlighting Jira’s board and issue features could drive similar gains.
Goal: Increase signup rates by 5% by integrating an interactive demo into the Jira Software Product Tour, offering new users a clear demonstration of Jira’s core value and fostering their confidence in its capabilities.
What we learned
The post-experiment analysis uncovered audience mismatches and visibility gaps, prompting critical questions for future iterations—namely, how to narrow the target audience and ensure better exposure.
Target Audience:
Challenge: The experiment targeted all users visiting the product tour, but 30% were already logged in and not in the target demographic.
Lesson: Narrow the target audience to new users (N2N) for future experiments.
Experiment Visibility:
Challenge: The demo was placed below the fold, with only 37% of users scrolling to view it.
Lesson: Ensure interactive elements are visible above the fold to maximize exposure.
Data Dilution:
Challenge: The exposure event fired at page load, including users who never interacted with the demo.
Lesson: Redefine exposure criteria to reflect actual interaction.
Process
This overview spans initial research through final execution and illustrates how I crafted an engaging interactive demo. The process prioritized user needs and delivered tangible results by leveraging data-driven insights and collaborative workshops.
Data Analysis:
Analyzed A/B testing results and heatmap data from the previous timeline demo to understand user behaviors and drop-off points.
Identified areas for improvement, such as visibility and step completion rates.
Workshops:
Facilitated a Crazy Eights workshop with PMs, engineers, and stakeholders to generate ideas and align on the demo's user flow.
Prioritized simplicity and engagement to maximize impact.
Design and Development:
Built a streamlined version of Jira Software, focused on board and issue features.
Added content-populated guiding cards and interactive elements (editable fields, pre-filled content) to create an engaging, step-by-step demo experience.
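To make the "5% uplift" goal concrete: an experiment like this is typically read out with a two-proportion comparison of signup rates between control and variant. The sketch below is illustrative only; the counts are hypothetical, since the real experiment's numbers were not part of this case study.

```python
from math import sqrt

def signup_uplift(control_signups, control_exposed, variant_signups, variant_exposed):
    """Return (relative_uplift, z_score) comparing variant vs. control signup rates."""
    p_c = control_signups / control_exposed
    p_v = variant_signups / variant_exposed
    # Pooled proportion for the standard two-sample z-test
    p = (control_signups + variant_signups) / (control_exposed + variant_exposed)
    se = sqrt(p * (1 - p) * (1 / control_exposed + 1 / variant_exposed))
    uplift = (p_v - p_c) / p_c
    z = (p_v - p_c) / se
    return uplift, z

# Hypothetical counts: even a true 5% relative uplift can look "flat"
# (z well below 1.96) at this sample size -- one reason exposure criteria matter.
uplift, z = signup_uplift(1000, 20000, 1050, 20000)
print(f"relative uplift: {uplift:.1%}, z = {z:.2f}")
```

This also illustrates the data-dilution lesson: counting non-interacting users in the exposed denominators shrinks the measured rates and makes a real effect harder to detect.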



Final Design
The final interactive demo allowed users to explore Jira Software’s board and issue features in a hands-on, guided format, seamlessly merging product functionality with marketing for an authentic experience.
Challenges:
Despite strong engagement, the demo did not reach the 5% signup goal.
Key obstacles included targeting a broad audience and the demo’s low-visibility placement.
Reflection
Lessons Learned
Target Audience:
Challenge: The experiment targeted all users visiting the product tour, but 30% were already logged in and not in the target demographic.
Lesson: Narrow the target audience to new users (N2N) for future experiments.
Visibility:
Challenge: The demo was placed below the fold, with only 37% of users scrolling to view it.
Lesson: Ensure interactive elements are visible above the fold to maximize exposure.
Data Dilution:
Challenge: The exposure event fired at page load, including users who never interacted with the demo.
Lesson: Redefine exposure criteria to reflect actual interaction.
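The data-dilution fix above can be sketched as a small exposure gate: fire the analytics event only once the demo is both visible and actually interacted with, rather than at page load. The event name, hooks, and callback here are hypothetical, not the actual Atlassian instrumentation.

```python
class ExposureTracker:
    """Fire the experiment exposure event only after the demo is both
    visible and interacted with -- never at page load."""

    def __init__(self, emit):
        self.emit = emit            # analytics callback (hypothetical pipeline)
        self.visible = False
        self.interacted = False
        self.fired = False

    def on_demo_visible(self):      # e.g. wired to a viewport/scroll observer
        self.visible = True
        self._maybe_fire()

    def on_demo_interaction(self):  # e.g. a click or edit inside the demo
        self.interacted = True
        self._maybe_fire()

    def _maybe_fire(self):
        # Emit exactly once, and only for users who truly saw and used the demo
        if self.visible and self.interacted and not self.fired:
            self.fired = True
            self.emit("demo_exposure")

events = []
tracker = ExposureTracker(events.append)
tracker.on_demo_visible()      # visible alone does not count as exposed
tracker.on_demo_interaction()  # now the single exposure event fires
```

Gating exposure this way keeps non-interacting visitors out of the experiment denominator, so the measured signup rates reflect users who actually experienced the demo.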
More Works

Jira Software
Signup Experiment
2024

Atlassian Platform
Webpage Redesign
2023
bildkritik