User Design Validation and Testing at Bonzai Intranet [Interview]

User experience validation is one of the most important components of UX design. However, in the modern world of design, we often hear "don't validate designs, test them."

We sat down with Mayra Pulido, our in-house UX Specialist, to discuss the topic of user design and share key insights into running the latest design test with some of Bonzai Intranet’s clients.

Find out more about the different approaches to user design validation and testing, and how Bonzai goes about this process.

Cameron: "Mayra and I just finished a project where we validated user designs, and we did a bit of user testing. I want to tell everybody how we came about this project because I find it very cool and insightful. How do you go about deciding to validate a design?"

Pulido: “Three months ago, we were working on a new release for Bonzai. We had a couple of features that we wanted to test. I think in total we had six of them. Three of those we had already tested previously.

So, we wanted to do specific validation for those designs to see if the updates we had for those designs were on the right path for the second release. We did a couple of exercises that involved surveys and user-testing tasks, and you helped me to contact the people."

Cameron: "Yeah. That's right. Mayra reached out to me after we did a little bit of prep work, and we had some assumptions, which we then needed to test by reaching out to people.

I reached out to the Project Managers, and the PMs gave me a list of very opinionated clients, exactly what we were looking for. I reached out to them, offering a gift card, a little sneak peek of our new release, and the chance to have their voices heard in the product.

We had a big response. I think we had about 10 people join for the user test, each session taking about an hour. In the past we did in-person user tests, but in order to reach more of a global audience this time, we did remote user testing, which was interesting."

In-person or Remote User Testing

Cameron: "Mayra, comparing in-person user testing and remote testing, what do you find is different and similar about the two?"

Pulido: "With the in-person ones, you get to have more engagement with the people at the beginning. You have the chance to have a conversation face-to-face. It can help the people that you're testing with to be more open or feel more comfortable, versus the remote user test where you don't see their faces. You really don't see their reactions, and you're just trying to go with the flow on what they're saying and all the things they're talking about.

But I feel in our case, our product is for the enterprise, so it’s really hard for us to collect data without permission, and things like that. There’s always, like you said, people who are very opinionated and willing to give feedback.

I feel remote user testing helps us reach more people than face-to-face. We also used Zoom, by the way, to do all those user tests. It was a really easy tool that let us record the sessions, use screen sharing, and everything."

Cameron: "Yeah. It was really nice to talk to clients that we don't normally get to talk to. Last time, we did a lot of local clients, which was good because we got to see them in their space. But this time, it gave us larger organizations and some of the clients in the States that had completely different challenges and problems."

What Did We Learn From User Testing?

Cameron: "One of the things that was really cool for me was the Nielsen Norman finding that after five people, users all start saying the same thing. Our testing really validated that, because we were starting to hear the same things surface over and over again.

What were some of the things that jumped out at you that you found were really insightful from this round of user testing?"

Pulido: "We had a lot of those. Of course, sometimes as a designer you have your assumptions, thinking the user is going to react in one way, but it turns out to be completely different from what you had initially thought. This is something that I really enjoy about user tests.

More than anything, the user tests gave us mostly qualitative information. It was more the users' opinions, so I was very interested in hearing how they felt while they were testing, the "wows" or the "oh, I love this". Basically, all those reactions that we got during the test when we were gathering first impressions.

I think that was one of my favorite things, and I didn't expect to get those kinds of "wows" or "oh, I love this. This is very cool." I feel there is something really rewarding about our job. After three months of working on something, showing it to people and getting this kind of reaction to your product is really rewarding."

The Nitty-Gritty of User Testing

Pulido: "Do you want to talk about how we gathered the data afterwards? We sorted it into positives, negatives, and improvements. The negatives weren't that negative."

Cameron: "Yeah, but we had to surface them to get some suggestions. On that topic, I think a lot of people do this, and you probably see it in user testing, but here's how Mayra and I went about collecting this data. As we said before, it was very qualitative data.

There weren't a lot of straight numbers. We had to re-listen to the recordings a lot and pull out the positives and the negatives, and as you said, not necessarily negatives, but just where they struggled or where they weren't quite hitting where we wanted them to.

We took all those, wrote them out on sticky notes, put them up on the board, and then collected and prioritized them by feature."

Pulido: "Like you said, we started getting the same reviews and comments after the first five or so. We were getting pretty much the same comments about icons or specific images that we were using on the website.

I feel when doing user tests, you have to know what to ask. You have to make sure the questions are constructed in a way that reveals what you want to get out of the test. In this case, we got a clear pattern of where the users were having issues.

That makes it super easy for the next round, when we want to run more user tests or improve the design. It gives us all the issues that we found, so we can do another round of designs and then validate again."

Cameron: "Yeah. I think it's perfect because we were really starting to benchmark ourselves. We did the first round of user testing, and now we've done the second one, and we were starting to really find our groove.

I felt that nowhere more so than when we actually presented it this time. We did a bit of a mix: I pulled all the numbers into Tableau, we aggregated them and surfaced the comments, and then presented that to the team.

Do you want to tell us about how you came to the findings in the presentation?"

Pulido: "The purpose of the presentation was to show where people were struggling in the product. When we were preparing the presentation, I realized that there were a couple of red flags in the data that came across as negative or bad feedback, but weren't."

Cameron: "No."

Pulido: "I gathered the best moments from all the user tests and showed them to the product team at the end. When they were looking at the data, they thought, 'Oh man, a lot of people are struggling with this and that.' But then at the end, there was a lot of 'Wow. This is so cool.' Even though users struggled a little bit, the outcome was very positive."

Cameron: "Yeah. I think that was a huge part of the feedback. People were like, 'You know what? There are these things that people are struggling with.' But then at the end, you realize that you're making a great product, and it makes it really exciting for everybody.

Yeah. I’d say good job on that."

Pulido: "Good job to you too, gathering all that data."

Cameron: "Yes. Thank you. That was exciting."

Pulido: "Yeah. I was reading the other day about design validation, and they were saying that you shouldn't validate in order to congratulate yourself as a designer. Instead of saying you're validating your design, you should say you're user testing, right?"

Cameron: "Yeah."

Pulido: "And don't worry about all the negatives that you get. No user test is ever going to come out all positive. If you ever do a user test and everything comes out perfect, with no changes to make, the only finding you got from that test is that you have to do another test, because the one you did wasn't right.

There's always going to be something, and people are always going to have an opinion on your design. Don't focus too much on individual feedback or on what one person said.

We always focused on what we got the most feedback on, while taking all the feedback into consideration. But if you get one negative comment about one specific element, don't get sad or too frustrated. The focus should always be on the big picture."

Cameron: "I think that’s great advice, and I really look forward to doing more user tests."

Final Thought

Choosing how to approach user design validation and testing is so important to making a great intranet user experience. Oftentimes, user design validation starts after the development process, or when most of the work for a new release, for example, has already been done.

Here at Bonzai, however, we think about user experience early on in the development process, making sure it is a primary consideration. This way it's easier, faster, and cheaper to make design changes.

Still have questions? Please comment below, and we'll be happy to discuss further.

It’s Time To Transform

Let us show you how much easier your work life can be with Bonzai Intranet on your team.