Interactive Demo Experiment
This case study covers an experiment on the Jira Software Product Tour that embedded an interactive demo of the board and issue features, aiming to increase user engagement and drive signups.
Product
Jira Software
What I did
Product design
My role
Design lead
Timeline
6 weeks
Impact
Evaluating the experiment’s outcomes against the success criteria of a 5% uplift in signup rates revealed flat metrics. Despite this, the experiment provided valuable insights for future improvements.
Success Criteria: A 5% uplift in signup rate was defined as the benchmark for success, with an estimated runtime of 7 days to achieve significance.
Outcome: After 14 days, none of the metrics reached significance. The results were flat.
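For context on the runtime estimate: detecting a 5% relative uplift takes a large sample when the baseline conversion rate is low. The sketch below is a standard two-proportion power calculation, purely illustrative; the 4% baseline signup rate is a hypothetical figure, not data from this experiment.

```typescript
// Rough sample-size estimate for a two-sided, two-proportion z-test.
// The 4% baseline signup rate in the example is hypothetical, not a
// real figure from the experiment.
function sampleSizePerArm(
  baseline: number,      // control signup rate, e.g. 0.04
  relativeLift: number,  // minimum detectable effect, e.g. 0.05 (5% relative)
): number {
  const p1 = baseline;
  const p2 = baseline * (1 + relativeLift);
  const zAlpha = 1.96;   // z for alpha = 0.05, two-sided
  const zBeta = 0.84;    // z for 80% power
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator / (p2 - p1)) ** 2);
}

// Example: 4% baseline, targeting a 5% relative uplift.
console.log(sampleSizePerArm(0.04, 0.05)); // ~154,000 visitors per arm
```

At that hypothetical baseline, each arm needs on the order of 150,000 visitors, which is why only a high-traffic page could plausibly reach significance within the estimated 7 days.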
Context
Drawing from the success of a prior interactive demo experiment, which achieved a 10% uplift in signups, this project aimed to replicate and build upon that success within a new context. The prior results informed the hypothesis and set expectations for this initiative.
What we learned
Reflecting on the experiment revealed critical insights about audience misalignment and visibility issues. These challenges prompted actionable questions for refining future experiments, such as narrowing the target audience and ensuring better exposure.
Target Audience:
The target audience for the experiment included all users visiting the English product tour page via desktop.
Data showed that 30% of these users were already logged in, diluting the results, as they were not the intended audience for generating new signups.
Experiment Visibility:
The exposure event was triggered at page load, but only 37% of users scrolled far enough to see the interactive demo (located below the hero section).
The exposure population included users who never saw the experiment, further diluting observed metrics.
Key Questions for Reflection:
Should we have enrolled only users who scrolled to view the interactive demo? (A sketch of scroll-gated exposure follows this list.)
Should the experiment have targeted only New-to-New (N2N) users, excluding existing users? (A sketch of that enrollment gate follows as well.)
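To make the first question concrete: a follow-up could fire the exposure event only once the demo actually enters the viewport, so the enrolled population matches the people who saw it. A minimal sketch using IntersectionObserver; the trackExposure function, experiment key, and #interactive-demo selector are hypothetical stand-ins, not the real instrumentation.

```typescript
// Sketch: fire the exposure event only when the demo is actually seen.
// `trackExposure` and the `#interactive-demo` selector are hypothetical
// stand-ins for the real analytics call and container.
function trackExposure(experimentKey: string): void {
  console.log(`exposure: ${experimentKey}`); // e.g. forward to Amplitude
}

const demo = document.querySelector('#interactive-demo');
if (demo) {
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        if (entry.intersectionRatio >= 0.5) {
          // Enroll only once at least half the demo is in the viewport.
          trackExposure('product-tour-interactive-demo');
          observer.disconnect(); // fire at most once per page view
        }
      }
    },
    { threshold: 0.5 }
  );
  observer.observe(demo);
}
```

For the second question, enrollment could be gated on login state before randomization. Again a sketch under stated assumptions: isLoggedIn and the session cookie name are illustrative, not how the actual platform decides.

```typescript
// Sketch: exclude logged-in users before randomizing into the experiment.
// `isLoggedIn` and the cookie name are assumptions for illustration.
function isLoggedIn(): boolean {
  return document.cookie.includes('session='); // hypothetical session cookie
}

function enroll(experimentKey: string): 'control' | 'variant' | null {
  if (isLoggedIn()) {
    return null; // existing users never enter this experiment
  }
  // Naive 50/50 split for illustration; real platforms hash a stable
  // visitor ID with the experiment key so assignment is sticky.
  const arm = Math.random() < 0.5 ? 'control' : 'variant';
  console.log(`enrolled in ${experimentKey}: ${arm}`);
  return arm;
}
```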
Process
From initial research and ideation to final design and implementation, this section highlights the journey of crafting an engaging and impactful interactive demo. The process was guided by data-driven insights and collaborative workshops, ensuring the outcomes were both user-focused and effective.
Secondary Research:
Analyzed usability testing results from the first interactive demo experiment to uncover areas for improvement. Conducted competitive research to understand how leading companies design and present online demos.
Used data tools like Contentsquare and Amplitude to evaluate key metrics (engagement rate, click rate, and exit rate), guiding informed, data-driven design decisions.
Workshops:
Facilitated a Crazy Eights workshop with the project manager, content designer, and lead engineer to generate a range of ideas quickly. This approach aligned cross-functional perspectives and encouraged creative thinking.
Design and Development:
Designed a simplified version of Jira Software as an interactive demo, complete with guiding cards to walk users through each step. The experience blended interactive elements, allowing users to input key information with pre-populated steps to create a streamlined and intuitive flow.
Final Design
The interactive demo’s final design aimed to blur the lines between product and product tour, providing a hands-on experience of key features. Despite its innovative approach, the results fell short of the expected 5% uplift in signup rates.
Challenges:
Results fell flat, failing to meet the 5% uplift target.
The experiment partially addressed the problem of allowing users to experience the product but did not lead to a measurable increase in signups.
Outcome
The findings emphasized the importance of precise audience targeting, better experiment visibility, and reassessing success metrics for future experiments. While the experiment did not achieve its primary goal, it provided valuable insights into:
The importance of targeting a more precise audience for future experiments.
The need to improve experiment setup to ensure better exposure and engagement metrics.