When a Demo Fails (And Why That’s Valuable)

We tested an interactive Jira Software demo to improve signups. While it didn’t increase conversions, it revealed valuable insights that shaped future onboarding experiments.

Product

Jira Software

What I did

Product Design

My role

Design Lead

Timeline

6 weeks


The Why

New users weren’t seeing Jira’s value before signup—static tours buried the story, driving high drop-offs.

A past interactive test had boosted signups by ~10%. Building on that, the challenge became: could we design a hands-on demo that let users try Jira before committing?

Project Constraints

While the idea was ambitious, the project had to run as a time-boxed growth experiment. That meant keeping scope lean, working within Atlassian's existing design system, and coordinating closely with PM and Engineering to move quickly without compromising quality.

Six-week timeline

Design, build, and launch in six weeks.

Experiment only

A time-boxed test rather than a full 0→1 product build, but one that produced validated insights.

Design system patterns

Reused existing patterns for speed, adding custom components where needed.

Design Process

I aligned the team with a Crazy Eights workshop where PM, Engineering, and Marketing prioritized their goals—enterprise credibility, simplicity, and onboarding. We scoped a lean MVP: a board-and-issue demo with pre-filled epics, editable fields, and guided microcopy.

Analyzing past experiment data revealed a 95% drop-off between updating an epic and adding a story. This pointed to friction in vague instructions and an overly open modal, which we focused on solving.
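The drop-off analysis above can be sketched as a simple funnel calculation. The step names and user counts below are illustrative stand-ins, not the experiment's real data:

```typescript
// Illustrative funnel drop-off calculation. Step names and user counts
// are made-up stand-ins, not the experiment's actual numbers.
type FunnelStep = { name: string; users: number };

// Percentage of users lost between two consecutive funnel steps.
export function dropOff(prev: FunnelStep, next: FunnelStep): number {
  return Math.round((1 - next.users / prev.users) * 100);
}

const funnel: FunnelStep[] = [
  { name: "update epic", users: 1000 },
  { name: "add story", users: 50 }, // a 95% drop-off like the one we found
];

for (let i = 1; i < funnel.length; i++) {
  console.log(
    `${funnel[i - 1].name} -> ${funnel[i].name}: ` +
      `${dropOff(funnel[i - 1], funnel[i])}% drop-off`
  );
}
```

Walking the funnel step by step like this is what surfaced the epic-to-story cliff as the spot worth fixing first.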

Wireframes

Grayscale layouts in Figma tested board structure; feedback pushed us toward a cleaner, streamlined version.

What we learned

  • The demo didn’t directly increase signups.

  • It validated that exposing value early improves engagement.

  • It showed where users dropped off, shaping hypotheses for future onboarding and paid product tours.

Lessons learned from the experiment to improve future engagement

Target Audience

30% of participants were existing, logged-in users—outside our intended cohort—diluting the results.

Lesson Learned

Future tests will target only "new-to-new" users to ensure relevance and clear impact.

Experiment Visibility

Our demo sat below the fold, so only 37% of users ever saw it—limiting its impact.

Lesson Learned

Position interactive elements above the fold to ensure they reach and engage the majority of visitors.

Data Dilution

Exposure fired on page load—counting users who never engaged and skewing our metrics.

Lesson Learned

Trigger exposure only after a user interacts with the demo to capture true engagement.
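The last fix above can be sketched in a few lines: gate the exposure event behind the user's first real interaction instead of firing it at page load. The event name and `trackEvent` helper here are hypothetical, not Atlassian's actual analytics API:

```typescript
// Hypothetical sketch: fire the experiment's exposure event only once,
// and only from an interaction handler, never on page load.
let exposureFired = false;

// Stand-in for a real analytics client (illustrative, not a real API).
function trackEvent(name: string, payload: Record<string, unknown>): void {
  console.log(`track: ${name} ${JSON.stringify(payload)}`);
}

// Returns true only the first time, so each visitor is counted once.
export function fireExposureOnce(): boolean {
  if (exposureFired) return false;
  exposureFired = true;
  trackEvent("demo.exposure", { experiment: "interactive-demo" });
  return true;
}
```

Wiring this to the demo container, e.g. `demoRoot.addEventListener("click", () => fireExposureOnce(), { once: true })`, means only visitors who actually touch the demo enter the exposed cohort.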

Final Design

These screens highlight the opening steps of the demo—showing how users were guided into Jira’s board and issue features.
While only the first steps are shown here, the full demo included complete task flows that generated the usage data and insights we analyzed.

Reflection

Lessons Learned

Challenges:

  • Despite strong engagement, the demo did not reach the 5% signup goal.

  • Key obstacles included targeting a broad audience and the demo’s low-visibility placement.

Target Audience:

  • Challenge: The experiment targeted all users visiting the product tour, but 30% were already logged in and outside the target demographic.

  • Lesson: Narrow the audience to new-to-new (N2N) users in future experiments.

Visibility:

  • Challenge: The demo was placed below the fold, and only 37% of users scrolled far enough to see it.

  • Lesson: Keep interactive elements above the fold to maximize exposure.

Data Dilution:

  • Challenge: The exposure event fired at page load, counting users who never interacted with the demo.

  • Lesson: Redefine exposure criteria to reflect actual interaction.
