Atlassian Platform
I redesigned Atlassian’s platform webpage to boost user experience and engagement. Drawing on data-driven insights and user research, I addressed high bounce rates, low conversion rates, and user confusion, and delivered a more intuitive, visually striking, and action-focused interface.
Product
What I did
product design
My role
design lead
Timeline



Impact
The redesign delivered measurable success:
$662K in sales pipeline influence.
80% enterprise customer response rate, nearly doubling the 45% goal.
Bounce rate reduced from 80% to 40%, a drop of 40 percentage points.
Context
The existing platform webpage struggled to engage enterprise customers effectively. It lacked clarity about the platform’s value, featured unintuitive navigation, and failed to provide clear user actions, resulting in:
High bounce rates (80%).
Low conversions.
With the Team 23 conference, where new products would be revealed, fast approaching, this redesign became a time-sensitive, high-priority initiative.
What we learned
Key Findings
Research revealed three critical pain points that hindered user engagement and conversion:
Target Audience:
Challenge: The experiment targeted all users visiting the product tour; 30% were already logged in and not in the target demographic.
Lesson: Narrow the target audience to new-to-new (N2N) users for future experiments.
Experiment Visibility:
Challenge: The demo was placed below the fold, and only 37% of users scrolled far enough to view it.
Lesson: Ensure interactive elements are visible above the fold to maximize exposure.
Data Dilution:
Challenge: The exposure event fired at page load, including users who never interacted with the demo.
Lesson: Redefine exposure criteria to reflect actual interaction.
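To illustrate the visibility and data-dilution lessons, here is a minimal instrumentation sketch, assuming a browser-based demo element and a generic trackEvent() analytics helper (both hypothetical, not Atlassian’s actual tooling): the exposure event fires only once the demo is at least half visible in the viewport, and a separate event records genuine interaction.

```typescript
// Hypothetical instrumentation sketch: fire the experiment's exposure event
// only when the demo is actually visible (and record interaction separately),
// instead of firing at page load. Element IDs and trackEvent() are assumptions.

function trackEvent(name: string, payload: Record<string, unknown> = {}): void {
  // Stand-in for the real analytics client.
  console.log("analytics:", name, payload);
}

function instrumentDemoExposure(demoId: string): void {
  const demo = document.getElementById(demoId);
  if (!demo) return;

  let exposed = false;
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        // Count exposure only when at least half of the demo is in view.
        if (!exposed && entry.intersectionRatio >= 0.5) {
          exposed = true;
          trackEvent("demo_exposed", { demoId });
          observer.disconnect();
        }
      }
    },
    { threshold: [0.5] }
  );
  observer.observe(demo);

  // Separate signal for actual engagement with the demo.
  demo.addEventListener(
    "click",
    () => trackEvent("demo_interacted", { demoId }),
    { once: true }
  );
}

instrumentDemoExposure("interactive-demo");
```

Gating exposure on visibility and interaction keeps users who never saw the demo out of the experiment denominator, which is exactly what diluted the original results.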
Process
I used a user-centered, data-driven design process to better understand the needs of users.
Quantitative Analysis:
Heatmaps, scroll depth, click rates, and bounce rates were analyzed to identify user behavior patterns.
Assumption Mapping Workshop:
Collaborated with PM & stakeholders to align on goals and prioritize research questions.
User Interviews and Usability Testing:
Conducted interviews with 13 participants across three personas: Enterprise Admins, Champions, and Leaders.
Tested the existing webpage for navigation, content clarity, and actionability.
Data Analysis:
Analyzed A/B testing results and heatmap data from the previous timeline demo to understand user behaviors and drop-off points (a minimal analysis sketch follows after this list).
Identified areas for improvement, such as visibility and step completion rates.
Workshops:
Facilitated a Crazy Eights workshop with PMs, engineers, and stakeholders to generate ideas and align on the demo's user flow.
Prioritized simplicity and engagement to maximize impact.
Design and Development:
Built a streamlined version of Jira Software, focused on board and issue features.
Added content-populated guiding cards and interactive elements (editable fields, pre-filled content) to create an engaging, step-by-step demo experience.
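As a rough sketch of how the step-by-step guided demo could be modeled, here is a simple client-side data structure; the DemoStep type, field names, and sample content are illustrative assumptions, not the shipped implementation.

```typescript
// Minimal sketch of a data model for the step-by-step guided demo.
// Type and field names are illustrative, not the actual schema.

interface DemoStep {
  id: string;
  title: string;            // Copy shown on the guiding card
  body: string;             // Supporting explanation for the step
  targetSelector: string;   // Element in the stripped-down Jira board to highlight
  prefilledValue?: string;  // Pre-filled content for editable fields, if any
  completeOn: "click" | "edit" | "next"; // What counts as completing the step
}

const demoSteps: DemoStep[] = [
  {
    id: "create-issue",
    title: "Create your first issue",
    body: "Issues track individual pieces of work on the board.",
    targetSelector: "#create-issue-button",
    completeOn: "click",
  },
  {
    id: "edit-summary",
    title: "Edit the issue summary",
    body: "Fields are editable, so you can try realistic changes safely.",
    targetSelector: "#issue-summary-field",
    prefilledValue: "Plan Q3 launch",
    completeOn: "edit",
  },
];

// A simple controller can advance through demoSteps and log completion
// per step, which is what makes drop-off points measurable.
function nextStep(current: number): DemoStep | undefined {
  return demoSteps[current + 1];
}
```

Keeping each step’s pre-filled content and completion trigger in data makes per-step completion easy to report, which is how drop-off points become visible in the analytics.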
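For the data-analysis step referenced above, here is a minimal sketch of comparing control and variant conversion rates with a two-proportion z-test; the traffic numbers are placeholders, not figures from this project.

```typescript
// Minimal two-proportion z-test sketch for comparing conversion rates
// between control and variant. Sample numbers below are placeholders.

interface VariantStats {
  visitors: number;
  conversions: number;
}

function twoProportionZ(control: VariantStats, variant: VariantStats): number {
  const p1 = control.conversions / control.visitors;
  const p2 = variant.conversions / variant.visitors;
  // Pooled conversion rate under the null hypothesis of no difference.
  const pooled =
    (control.conversions + variant.conversions) /
    (control.visitors + variant.visitors);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors)
  );
  return (p2 - p1) / se;
}

const z = twoProportionZ(
  { visitors: 10000, conversions: 420 },  // control
  { visitors: 10000, conversions: 505 }   // variant
);
// |z| > 1.96 roughly corresponds to p < 0.05 for a two-sided test.
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```

The same comparison applies to any of the funnel metrics above, such as click-through, demo completion, or bounce.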



Final Design
Key Improvements:
Show Platform Value:
Added interactive visuals, including an updated platform diagram with cross-linking opportunities and in-product videos.
Integrated social proof with logos, quotes, and a carousel of customer case studies.
Define Clear Actions:
Streamlined the user journey with clear CTAs, placing “Contact Sales” prominently at the top and bottom of each page.
Simplified navigation paths to reduce friction.
More Works

Jira Software
Interactive Experiment
2023

Jira Software
Product Experiment
2024