SupplyScreen

UX Research & Design

View Prototype
SupplyScreen header image

Introduction

On SupplyShift’s platform, our primary users, Buyers, send questionnaires called Assessments to their Suppliers. These Suppliers, in turn, answer the Assessments to the best of their ability and provide information that increases transparency between Buyer and Supplier. However, this process can take quite a while to implement, which meant that the time-to-value for our customers was substantially slower than we’d like. The goal for this project was to give our customers an instant ESG (Environmental, Social, and Governance) rating for any supplier they added to the platform by name, substantially shortening their time-to-value.

Research

For this project, I wanted to take enough time to interview the stakeholders involved so that I could gather what each of them knew and see the bigger picture from as many vantage points as possible. I also wanted to interview members of the implementation team, since they work so closely with our customers every day. This would help me uncover the problems we were hoping to solve with this project.

Stakeholder & Internal Team Interviews

I conducted six one-on-one interviews with project stakeholders and implementation team members. These were mostly for research, but also to ensure that all stakeholders were on the same page about the project direction. Some questions that I was hoping to answer were:

Affinity Map

Affinity map

Once I had conducted all the interviews, I went back through the recordings to take down notes, main points of concern, and any quotes that might be useful when communicating project direction. To surface common pain points and areas of concern, I created an affinity map of all the collected notes.

Common problems that I found were:

Ideation

Project Goals

After the initial research was complete, I paused to reconnect with my Product Manager so that we could align on goals for moving forward. We came up with the following goals to keep in mind:

Create a way for users to quickly and easily receive a general ESG rating for any suppliers they added to the platform.
Ensure that the ratings were easy to understand.
Keep the MVP minimal so that we could move forward quickly with a launch.

"How Might We" Statements

With the common pain points identified and our goals in sight, I also formed some How Might We statements to help my product manager and me focus on the most pressing areas.

How might we help Buyers to focus on the most important Suppliers to engage with first?
How might we help Buyers become more confident in their own benchmarking?
How might we inform Buyers about ESG ratings so that they understand how the ratings can be used?
How might we prevent the need for Buyers to leave the SupplyShift platform to engage with ESG ratings?

User Flows

After a rough roadmap had been decided on, I moved forward with creating task and user flows for the experience. I made sure to check in with my product manager and members of the engineering team to ensure that the flows made sense to them and that everything was feasible to build.

A user flow showing a user uploading and receiving a supplier rating.
A user flow showing a user filtering by the lowest ESG rating.

Design

Low-Fidelity Wireframes

I then started working on low-fidelity wireframes in Figma. I had several ideas for presenting the ratings, so I tried different layouts to see which made the most sense to my product manager, the engineers, and our stakeholders. We also wanted to think about future iterations that would allow our users to dig even deeper into the data than the initial ratings.

Low-fidelity wireframe of SupplyScreen risk assessment.
Low-fidelity wireframes of SupplyScreen risk assessment.
Low-fidelity wireframes of SupplyScreen risk assessment with the filtering panel open.

High-Fidelity Wireframes

Ultimately, we decided to move forward with a layout that split the ESG ratings into categories so that it was clearer to our users where each supplier might struggle or excel. With that in mind, I moved into creating the high-fidelity wireframes so that I could start putting the work in front of testers.

A list of supplier scores.

Because the engineers were unsure how long it would take a new score to load, I wanted to make sure to include a loading state for a "retrieving" score.

A list of supplier scores.
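As a rough sketch of how that "retrieving" state might be modeled, here is a small TypeScript illustration using a discriminated union and an assumed 0–100 score scale; the names and scale are my own assumptions, not SupplyShift's actual implementation.

```typescript
// Hypothetical model for a supplier's ESG score cell: the score is either
// still being retrieved or has resolved to a numeric value (assumed 0–100).
type ScoreCell =
  | { status: "retrieving" }
  | { status: "ready"; value: number };

// Returns the text a table cell would display for a given score state.
function scoreCellLabel(cell: ScoreCell): string {
  return cell.status === "retrieving" ? "Retrieving…" : cell.value.toString();
}

console.log(scoreCellLabel({ status: "retrieving" }));       // "Retrieving…"
console.log(scoreCellLabel({ status: "ready", value: 72 })); // "72"
```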

It was important that our users be able to use this data flexibly, so I made sure to include the ability to sort by score in each column.
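A minimal sketch of what per-column sorting could look like, assuming each supplier row holds resolved numeric scores per category; the row shape and example data below are hypothetical, used only to illustrate the interaction.

```typescript
// Hypothetical supplier row with per-category ESG scores (assumed 0–100 scale).
interface SupplierRow {
  name: string;
  scores: { environmental: number; social: number; governance: number };
}

type Category = keyof SupplierRow["scores"];

// Returns a new array sorted by the chosen category column.
function sortByCategory(
  rows: SupplierRow[],
  category: Category,
  direction: "asc" | "desc" = "desc",
): SupplierRow[] {
  const sign = direction === "asc" ? 1 : -1;
  return [...rows].sort((a, b) => sign * (a.scores[category] - b.scores[category]));
}

const rows: SupplierRow[] = [
  { name: "Acme Textiles", scores: { environmental: 41, social: 78, governance: 63 } },
  { name: "Globex Metals", scores: { environmental: 82, social: 55, governance: 70 } },
];

console.log(sortByCategory(rows, "environmental", "asc").map((r) => r.name));
// ["Acme Textiles", "Globex Metals"]
```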

A list of supplier scores. Shows an expanded filtering panel.

Filtering the scores was a given, so I made the filtering flexible within each column category. To keep the scores understandable, I labeled the score buckets with terms like "high risk" and "low risk."
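To illustrate the bucketing idea, here is a small sketch that maps numeric scores to labeled risk buckets and filters by the selected labels; the thresholds and the assumption that lower scores mean higher risk are mine, not the product's.

```typescript
// Hypothetical risk buckets used as filter labels; thresholds are assumptions.
type RiskBucket = "High risk" | "Medium risk" | "Low risk";

// Maps an assumed 0–100 score to a labeled bucket (lower score = higher risk here).
function toRiskBucket(score: number): RiskBucket {
  if (score < 40) return "High risk";
  if (score < 70) return "Medium risk";
  return "Low risk";
}

// Keeps only the scores that fall into the selected buckets.
function filterByBuckets(scores: number[], selected: RiskBucket[]): number[] {
  return scores.filter((s) => selected.includes(toRiskBucket(s)));
}

console.log(filterByBuckets([25, 55, 88, 91], ["High risk"])); // [25]
```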

Testing

Testing Plans

Because we wanted to move quickly, I conducted three usability tests of the high-fidelity prototype with members of our implementation team. My primary goal was to make sure that the layout made sense and was relatively easy to understand at first glance. The tasks for each test were:

Testing Results

The tests ran successfully, and the testers completed the tasks without issue. I did receive feedback that finding more information about the scores was a bit of a hassle when it was hidden in the filter sidebar, so I moved on to some small revisions.

Revisions

While working on the revision suggested by my testers, I also received a request from my Product Manager to break the design work down into smaller pieces for shorter bursts of engineering effort. In the end, this was helpful when it came to finding a new place in the design for the informational modal link. The first phase of SupplyScreen split the filtering system out into later tickets, opening up space on the main page for a link that later testers spotted easily.

I also decided to add clarifying tooltips for users when they hovered over the "retrieving" status of a score.

A list of supplier scores. Shows an arrow hovering over a "retrieving score" revealing a tooltip.