When a Demo Fails (And Why That’s Valuable)

Jira Software · Platform
What I did: Product Design
My role: Design Lead
Timeline: 13 weeks

The Why

New users weren’t seeing Jira’s value before signup—static tours buried the story, driving high drop-offs.

A past interactive test had boosted signups by ~10%. Building on that, the challenge became: could we design a hands-on demo that let users try Jira before committing?

Scoping the MVP

I aligned the team with a Crazy Eights workshop where PM, Engineering, and Marketing prioritized their goals—enterprise credibility, simplicity, and onboarding. We scoped a lean MVP: a board-and-issue demo with pre-filled epics, editable fields, and guided microcopy.
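
To make that MVP scope concrete, here is a minimal sketch of what the demo's seeded content could look like: one pre-filled epic, a story, editable fields, and guided microcopy. The TypeScript shape and field names are hypothetical illustrations, not Jira's actual data model.

```typescript
// Hypothetical seed data for the lean demo MVP: pre-filled epics, editable
// fields, and guided microcopy. Field names are illustrative only.

interface DemoIssue {
  key: string;
  type: "epic" | "story";
  summary: string;
  editable: boolean;   // which items the visitor can change in the demo
  hint?: string;       // guided microcopy shown next to the step
}

const demoBoard: DemoIssue[] = [
  {
    key: "DEMO-1",
    type: "epic",
    summary: "Launch the new onboarding flow",
    editable: true,
    hint: "Try renaming this epic to match your own project.",
  },
  {
    key: "DEMO-2",
    type: "story",
    summary: "Draft the welcome email",
    editable: true,
    hint: "Add a story of your own to see how work rolls up to the epic.",
  },
];

export { demoBoard };
```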

Analyzing past experiment data revealed a 95% drop-off between updating an epic and adding a story. This pointed to friction in vague instructions and an overly open modal, which we focused on solving.
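
For context on how a figure like that 95% drop-off is derived, here is a minimal sketch of a step-to-step funnel calculation. The event names and user counts are hypothetical, not Atlassian's actual instrumentation; the point is only how retention between adjacent steps is computed.

```typescript
// Minimal funnel drop-off sketch. Event names and counts are hypothetical,
// used only to illustrate how a drop-off like "95% between updating an epic
// and adding a story" would be measured.

interface FunnelStep {
  event: string;   // assumed analytics event name
  users: number;   // distinct users who reached this step
}

const funnel: FunnelStep[] = [
  { event: "demo_viewed",       users: 10_000 },
  { event: "demo_epic_updated", users: 2_400 },
  { event: "demo_story_added",  users: 120 },   // ~95% drop-off from the previous step
  { event: "signup_started",    users: 60 },
];

// For each adjacent pair of steps, report conversion and drop-off.
for (let i = 1; i < funnel.length; i++) {
  const prev = funnel[i - 1];
  const curr = funnel[i];
  const conversion = curr.users / prev.users;
  console.log(
    `${prev.event} -> ${curr.event}: ` +
    `${(conversion * 100).toFixed(1)}% converted, ` +
    `${((1 - conversion) * 100).toFixed(1)}% dropped off`
  );
}
```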

What we learned

  • The demo didn’t directly increase signups.

  • It validated that exposing value early improves engagement.

  • It showed where users dropped off, shaping hypotheses for future onboarding and paid product tours.

Project Constraints

While the idea was ambitious, the project had to run as a time-boxed growth experiment. That meant keeping scope lean, working within Atlassian’s existing design system, and coordinating closely with PM and Engineering to move quickly without breaking quality.

Research-heavy

Had to validate with interviews + heatmaps before changes.

Enterprise alignment

Multiple stakeholders (PM, Eng, Marketing) to reconcile.

13-week scope

Limited sprint cycle to deliver redesign end-to-end.

Design Process

Audit

Reviewed analytics, heatmaps, and competitor benchmarks to identify key pain points.

  • Analyzed user behavior, scroll depth, and conversion funnels (80% bounce).
  • Benchmarked competitor pages and ran a content audit.
  • Identified 3 critical issues: unclear value proposition, poor navigation hierarchy, weak CTA placement.

Wireframes

Partnered with content design to restructure hierarchy and clarify value props.

  • Created low → high fidelity wireframes, iterating weekly with async share-outs + design crits.
  • Referenced content audit and competitor benchmarks for CTA placement.
  • Collaborated closely with Eng to avoid blockers.

Testing

Ran usability sessions to validate navigation clarity, messaging, and CTA hierarchy.

  • Conducted 13 sessions across 3 enterprise personas (admins, champions, leaders).
  • Asked targeted questions on comprehension + recall of key value props.
  • Refined flows and microcopy based on user feedback.

Launch

QA’d with PM and Eng to preserve SEO and accessibility before rollout.

  • Validated consistency with Atlassian’s design system.
  • Documented specs for engineering QA and SEO compliance.
  • Rolled out redesign across multiple platform pages in 13 weeks.

01 · Audit

Ran a workshop with PM to map key personas and research goals. Defined 3 enterprise personas (admins, champions, leaders) to guide design decisions.

02 · Wireframes

Created low–mid–high fidelity wireframes with iterative feedback cycles. Partnered with content design to restructure hierarchy and CTA placement, referencing audit + benchmarks.

03 · Testing

Conducted 13 usability sessions across personas. Validated nav clarity, value comprehension, and CTA recall. Refined messaging and flow based on feedback.

Final Design

These screens highlight the opening steps of the demo—showing how users were guided into Jira’s board and issue features.
While only the first steps are shown here, the full demo included complete task flows that generated the usage data and insights we analyzed.

What Changed (and Why It Mattered)

Here’s how research translated into high-impact updates—showing the before/after shift for each area.

Design Improvements

Clearer Conversion Paths

CTAs were restructured to guide users toward high-value actions. Placement was informed by heatmaps and conversion goals, ensuring “Contact Sales” and signup options were always visible at the right moments.

Before

  • Only one CTA (newsletter signup)
  • “Contact Sales” option missing
  • Text links easy to overlook

After

  • Persistent CTAs at top + bottom
  • Bold, accessible buttons
  • Placement guided by heatmaps

Impact

Redesign Results

Outcome: After 14 days, rates remained flat—but the experiment surfaced critical insights to guide our next optimizations.
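
As context for what “flat” means in a readout like this, here is a minimal sketch of a generic two-proportion comparison between control and variant signup rates. The exposure and signup counts are invented, and the z-test shown is a standard textbook check, not a claim about the analysis Atlassian’s experimentation tooling actually performs.

```typescript
// Hypothetical readout: compare signup conversion in control vs. demo variant.
// Counts are invented; the two-proportion z-test is a generic approach.

interface Arm {
  exposed: number;  // users counted as exposed to this arm
  signups: number;  // users who signed up
}

function compareArms(control: Arm, variant: Arm): void {
  const p1 = control.signups / control.exposed;
  const p2 = variant.signups / variant.exposed;
  const pooled =
    (control.signups + variant.signups) / (control.exposed + variant.exposed);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.exposed + 1 / variant.exposed)
  );
  const z = (p2 - p1) / se;

  console.log(`control: ${(p1 * 100).toFixed(2)}%  variant: ${(p2 * 100).toFixed(2)}%`);
  console.log(`lift: ${((p2 - p1) * 100).toFixed(2)} pts, z = ${z.toFixed(2)}`);
  // |z| below ~1.96 means the difference is within noise at the usual 95%
  // level, i.e. the rates read as "flat".
}

compareArms(
  { exposed: 50_000, signups: 1_510 },  // control (hypothetical)
  { exposed: 50_000, signups: 1_540 },  // variant (hypothetical)
);
```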

Lessons Learned

Challenges:

  • Despite strong engagement, the demo did not reach the 5% signup goal.

  • Key obstacles included targeting a broad audience and the demo’s low-visibility placement.

Target Audience:

  • Challenge: The experiment targeted all users visiting the product tour, but 30% were already logged in and not in the target demographic.

  • Lesson: Narrow the target audience to new users (N2N) for future experiments.

Visibility:

  • Challenge: The demo was placed below the fold, with only 37% of users scrolling to view it.

  • Lesson: Ensure interactive elements are visible above the fold to maximize exposure.

Data Dilution:

  • Challenge: The exposure event fired at page load, including users who never interacted with the demo.

  • Lesson: Redefine exposure criteria to reflect actual interaction (a sketch of what that could look like follows this list).
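
To make the data-dilution lesson concrete, here is a minimal sketch of one way exposure could be gated on real visibility and first interaction rather than page load. The `trackEvent` helper and the `jira-demo` element id are hypothetical, and IntersectionObserver plus a first-interaction listener is just one reasonable approach, not the instrumentation we actually shipped.

```typescript
// Sketch: fire "exposure" only when the demo is actually seen, and a separate
// event on first interaction. `trackEvent` and the element id are hypothetical.

declare function trackEvent(name: string, payload?: Record<string, unknown>): void;

function instrumentDemo(demoId = "jira-demo"): void {
  const demo = document.getElementById(demoId);
  if (!demo) return;

  // Exposure: the demo has been at least 50% visible in the viewport.
  const observer = new IntersectionObserver(
    (entries) => {
      if (entries.some((e) => e.isIntersecting)) {
        trackEvent("demo_exposed");   // replaces the page-load exposure event
        observer.disconnect();        // fire once per page view
      }
    },
    { threshold: 0.5 }
  );
  observer.observe(demo);

  // Engagement: the user actually touched the demo.
  const onFirstInteraction = () => {
    trackEvent("demo_interacted");
    demo.removeEventListener("pointerdown", onFirstInteraction);
    demo.removeEventListener("keydown", onFirstInteraction);
  };
  demo.addEventListener("pointerdown", onFirstInteraction);
  demo.addEventListener("keydown", onFirstInteraction);
}

instrumentDemo();
```

Counting exposure this way also serves the visibility lesson: visitors who never scroll far enough to see the demo never enter the experiment’s denominator in the first place.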
