Hello, ladies and gentlemen, and welcome to the UI Breakfast podcast. I'm your host, Jane Portman, and today our awesome guest is Jason Giles, VP of Product Design at User Testing, and we're going to talk about user-centered decision-making today.
This episode is brought to you by Wix Studio, the new web platform for agencies and enterprises. The magic of Wix Studio is its advanced design capabilities, which make website creation efficient and intuitive. Here are a few things you can do: work in sync with your team on one canvas, reuse templates, widgets, and sections across sites,
create a client kit for seamless handovers, and leverage best-in-class SEO defaults across all your Wix sites. Step into Wix Studio to see more at wix.com/studio. Hi, Jason. Hey, Jane. Thanks for having me. We're honored to have you here today. Thanks for joining. I am looking forward to our conversation.
So you're VP of product design at the most famous user testing company in the world, called User Testing. Great start. You've been with them for five years and you know everything about building cultures. But before we dive into that, tell us more about the full story: your magnificent rise at Microsoft and what happened next. Wow. Okay. Yeah, totally. Yeah.
So first off, I'm really old. So I'm going to have to take you all the way back to the mid-90s, when I was an electronic musician in Seattle trying to create music during the grunge days, when the Pearl Jams and Nirvanas were out there. And what happened at the same time was that this internet thing came along. Since we were already on computers, we were like, man, this internet thing combined with the music could be really cool. So we started experimenting with
online collaboration and building web pages and just kind of figuring it out as we went. Unfortunately, at a certain point, I realized I wasn't going to be a professional musician. Our band wasn't going to take off. I was like, I need a different job. And at the time I was there in Seattle,
and Microsoft was really interested in bringing in people who knew anything about the internet and how it could be applied. Because I had all this experience, just through my curiosity and trying and doing different things, I was able to get my foot in the door with an interview and talk about some of the work that I had done, and
they were willing to take a chance on this purple-haired, pierced dude. And it was great. You want to talk about imposter syndrome? Oh my gosh. I remember going in, you know, my first days, just looking over my shoulder, expecting someone to say, hey, this kid doesn't know what he's doing, get him out of here. But what was super cool was this first team that I joined. I was on a small design team, and
sitting right next to me was this guy who had the title human factors engineer. Like, what the heck is this guy doing here? I don't understand.
And what we later found out was that this was the beginnings of user research, usability engineering, all of that. I didn't realize it at the time, but I was spoiled to just be dropped into the early days of what became user experience. And so I kind of tell people I grew up at Microsoft, and I've been able to grow up with the UX discipline.
And that ended up spanning a career at Microsoft of over 15 years, which was awesome. At a certain point, I got over my imposter syndrome and I moved into more kind of leadership roles, design management, and was pretty successful. I kind of specialized in working with
teams that had a lot of friction and maybe some kind of outdated practices. You have to remember that back then, it was very difficult for design to have a seat at the table. And so there was this constant mission, if you will: get more visibility and impact, and influence the product more. And
I was there for that whole ride, managing teams through that, figuring out how to get equality, a voice at the table, to influence more deeply. So that let me work on products like early versions of Office 365 and a bunch of
B2B things. I worked with Microsoft Research for a while, doing some of the machine vision work that they were doing. And then finally I was leading the design team on the software side for Xbox,
which at Microsoft is pretty cool. Like, oh, that's a known brand, a fun audience. So I was pretty proud. We did the last version of Xbox 360, and we brought the first version of Xbox One to the world, which was just an honor: to work with incredibly, insanely talented people and be inspired every day, and to figure out how you take all these really talented and creative people and
get them to work together on a common objective and build something that, you know, we're proud of. So that was really cool. I did a couple versions of that at Microsoft. And then, you know, I was pretty successful. I was doing well. But I started asking myself, is my success only because I figured out how to build teams and design products at Microsoft?
And so I kind of had to prove to myself that that wasn't the case. I had an opportunity to join a friend down in Los Angeles and build a team at DirecTV. It was great, my first experience outside Microsoft, and it was awesome because I had this trusted partner, and he was more of a brand guy.
And he's like, hey, you know product, you run that department. And together, over about three years, we went from this small scrappy team, where the admin was doing kind-of usability tests, to building out a really professional design program. We built usability labs, and we had a design strategy practice that did internal consulting in the business.
It was something I felt really proud of. It took me about three years to get that practice going, and I'm like, well, let's do that again. So then I went to Ticketmaster and turned the crank on that again, this time with more of a focus in one area: how do I convince the company,
and the leaders of the company, to think more about the customer? So a lot of my time, even though I was still building a team and still working on products, went into convincing them that the voice of the customer really matters. And it's not like they didn't care, but they needed tangible proof and belief to invest more in doing that kind of user research and customer-focused practices.
Again, awesome experience, cool space, concerts and sports and all that. But as I was thinking about what to do next, I reflected a little bit and found that as I've been working for smaller and smaller companies, I'm having more and more fun. And for me, I was like, well, what if I went really small? And it just so happened that this company, User Testing,
which I was a customer of and knew very well,
was looking for somebody to help them really scale and grow as a company. So I took a risk. For me, it was nice talking to the CEO as part of my interview and having him ask me about my thoughts on how to enable more customer-focused design. That's just crazy. Usually I spend my first year trying to convince an executive team of why,
you know, customer-informed decisions matter. So that was kind of a treat, kind of unique. That was five years ago, and I've been working with them on various versions of the product and
focusing not only on the products that we build, but on how we build them: what type of culture and what ways of working it takes to make sure that, as we're making all these decisions about what we should build, how we should build it, and whether it's good enough to release, we're actually practicing what we're preaching. And along the way, and I've done this in previous companies, I realized that
You have to design a program that's appropriate for the context, for the company, for the level of risk that they're willing to take, for their commitment to investing in something like this.
And so, five years later, multiple iterations of going through that, I feel pretty good about what we've stood up. Along the way, I had the opportunity to move to Edinburgh, Scotland, so I'm getting to work out of the office there and enjoy the experience of getting more of a firsthand appreciation for the cultural differences, because we are a global company
with offices in Australia, Singapore, Spain, and Canada. It's been a wild ride, but it's been awesome.
Just for context, you call that a really small company. User Testing five years ago was 300 people and 11 years old. A really small company. I know. It's all about your frame of reference. Remember, I came from Microsoft. But kudos. They're now 16 years old and 800 people. So you've been able to follow that amazing growth. Exactly. Wow.
Were they hiring for this position when you joined, or did you pitch yourself? Was it some sort of informal deal? Yeah, no, they were hiring. I think I actually saw a post online that came through my feed, and I reached out, and
very quickly I was on a call with the CTO as he was driving his car. If you knew our CTO, you'd be like, oh, that doesn't surprise me. Yeah, things moved very quickly, obviously. But, you know, they had aspirations there
to start going global. They were really leaning in; they had just opened this office in Scotland. So I had experience with managing international teams, which was a bonus. And I was bringing a lot of experience from, you know, very enterprise-level practices, right?
And, you know, they had these aspirations of, well, wait, we need to scale to that. Can you take us along for that ride? Can you help us figure out how to do that? And I must be doing okay, because I'm still here five years later. We're going to talk about how to create a program in your business that helps people make product decisions based on user opinion, user input.
But I'd love to start with the opposite question: what does the opposite of that look like? What does it mean when decisions are not made
with the user in mind? Because obviously the rest of the company, the people who are not designers, they're not the most stupid people in the world. They still want to drive profits. They want to build something useful. How come one is the opposite of the other? Yeah, I mean, I think there's a few reasons. We just did a survey and we found that, you know, two-thirds of design projects don't have any research support, and more than half of product managers
say that they frequently have to just guess at what customers actually want. I don't think that's because they want to be doing that. I think there's a lot of perception that it takes a lot of time
to get that. Typically, when you said user research, what that used to mean was that you would invest possibly hundreds of thousands of dollars to build a lab with all this fancy equipment, and you'd pay people to come in, and you'd do all these moderated sessions, asking them questions, or focus groups.
And then you'd spend weeks synthesizing the data and analyzing the reports. And by the time that information is there, you've already made the decision. So those kinds of old traditional ways didn't keep up with the speed of business. Versus, you know, when I was at Ticketmaster, and I remember this very distinctly, we were having a discussion
about terminology, a name for a new feature that we were putting out there. These are all senior stakeholders, and we're sharing our opinions. Now, our opinions are informed by our experience, and we've been in the industry for a long time, but we're just going back and forth and back and forth. And I'm getting fed up. So I take down the different options. And the Ticketmaster office used to be right on Hollywood Boulevard.
So I'm like, hold that thought. I run out of the office. I start grabbing people off the street. Like, hey, tell me what you think of when you see this term. Tell me. In 20 minutes, I came back and I had a list: look, these people were actually able to articulate what they thought this was, and that's actually what we're trying to provide.
And that is kind of the speed of decision-making. These decisions get made in boardrooms, and what software like User Testing and others do is reduce that time. So it doesn't take weeks to get confidence about whether a product is good or not. It doesn't take
days to find out, hey, does this name of a feature actually resonate with real people? We're talking about hours. And so I think that's one of the big barriers: this perception that it has to take a long time. Yeah.
I hope that your approach doesn't just entail user testing, and a big advertisement for your company, because there are other types of research, like cataloging insights that you gather from support, interviews, demos, and a gazillion other ways. Totally. Absolutely. Thank goodness. Yeah.
So how do you know, for different types of decisions, what types of research you need? How do you think about this? What's the mindset?
This is where it's helpful to have folks with experience around what the best way is to answer this question, and what you're going to do with the information. I'm going to make a big, risky decision about the business, I need high confidence? You're going to need a lot of numbers, so you're thinking about quantitative studies and surveys and things like that. I just want to know if this design that I'm thinking about pursuing and pitching to the engineering team works,
kind of makes sense? Like, does this concept even make sense? That would be a different type of study that you might do. Going back to my example at Ticketmaster, running out onto the street: we called that guerrilla testing.
You know, we used to go set up little stations in Starbucks or different coffee shops just to get some eyeballs on our design. And that's okay for the right type of question that you're asking, the kind of information that you want to understand, and how confident you want to be about that decision. It's one thing for me to go out on Hollywood Boulevard, but is the answer that I'm going to get there
true in Bangladesh or Brazil? So again,
the level of confidence kind of informs the different kind of strategy that you might use to get there. It's, of course, amazing if you have such an experienced person on the team. Not everybody has that luxury. I'd love to hear some stories, if you have any, of decisions that were made at User Testing or your previous companies and the type of research that was necessary to support them.
Yeah. And the ROI and the results, of course, the sweet stories. Yeah, yeah, yeah. Boy, you're drawing from 20, 25 years of doing this. Some of my favorites at Microsoft, for Xbox: we had what's called a golden path program, where we really wanted to understand the end-to-end experience. So this is what we call a longitudinal study, which runs over a long stretch of time, where it
goes out into people's homes and understands, what was your buying process, or the unboxing process? And then following those same participants: okay, great, you've just used the Xbox, what were your first impressions, were you able to get up and going? And going over
everything, all the way to, hey, you had a problem with your Xbox and you called support, and understanding what that experience was. You know, in the industry we call that kind of CX testing, like the full end-to-end experience. And that's a full, robust program that requires a lot of time and planning and thinking. And it was international. So you get really intense kinds of tests like that.
Or when I first started at Ticketmaster, we had a bunch of concepts. We're like, hey, we want to improve the ticket-buying experience. So literally, we drew out wireframes, drawings of what the experience could be like. We got a bunch of donuts, went down into the cafeteria, and excluded anybody who was from the product team. So you've got the lawyers there, you've got the admins, and we offered donuts to anyone who would walk through it with us
and give us feedback. Like, what do you think you would click on here? You know, oh, I don't know what to click. Well, tell me about what you would expect to see. So basically, that would be a moderated wireframe click-through study. But, you know, intuitively, it's just that opportunity to get some feedback and reinforce your ideas. When you ask about the impact: for those, designers basically get more confidence, right,
in their design solution. You have less arguing over which way we should go anytime you can bring some evidence to the conversation, so it speeds up the discussions about what should be built. And then hopefully, once you start prototyping, you can test those as well, and when you put it into market you can do A/B testing and monitor that. The intent is that you
deliver experiences that are not only delightful and easy to use, but are hitting the business metrics. So those are a couple of different examples. But it's amazing, you know, some of my most fun memories are of us not having any budget, not having any tools, and having to get creative. And being a designer, like,
what an awesome problem to solve. Like, how can we figure out a way to get some more customer voice included in the process? So you've mentioned multiple times that such a program needs to be designed to fit the company.
I'd love to hear what the typical components of such a program include. Is it a set of SOPs, a set of activities? Is it a rollout process, how you actually engage people in doing this? Yeah. Or a gazillion other things. I'd love to hear your overarching view. Yeah. I mean, in some ways, if you abstract it far enough, it's like any kind of program design. You have to do some of the basic things, starting with
getting alignment from the top. Is this something that we want to invest in? Having those conversations around, look, we feel that we could be making better decisions as we build our campaigns or our products. Okay, do we agree on this? Do we understand it's going to take some investment? That might be money, but it also could be initial time up front to set it up. And getting clear on the goals: what do we expect?
Lots of companies have these values. Customers first. We love our customers. But, you know, how do they make that real? How do people within the company actually get a sense of their customers, of whether our customers love us? And so that can tie into the reasons why you want to invest in programs like this.
And sometimes that takes a long time for some companies. Maybe they're completely metrics-driven. So it's like, here's how we find out: we build the product and then we instrument it. And then we do a bunch of A/B testing and we refine, refine, refine until it hits what we want. And that is fine. But imagine if you could reduce the number of experiments that you had to do before you actually invest in building the product.
And do both, right? So your experiments become more effective. You're reducing the number of options. Those kinds of conversations depend on the mindset of a company
and the culture that exists, right? And so you need to get that alignment, because it does require support. I found that the best programs are driven both top-down, where you need to hear it from the top that, hey, we're investing in this, here's why it's important, here's what we want to accomplish from this, and from the ground up. So the first step is actually getting that alignment at the top,
which is exactly what you'd want to do for any other program. The next piece is thinking about the actual investment. So is it investing in people, or do you need to bring in some specialists? Do you need to bring in some consultants to help think about this? Are there specific tools that you need to invest in? So again, at that leadership level, it's understanding that there's a commitment to this.
What I've done in the past: you also have to gauge the commitment level, and if it isn't there, you have to start small. You have to do the guerrilla stuff. You have to start making a case: hey, here's some lightweight testing that we did, or hey, we got some feedback.
And you start sharing those successes and how they impacted things, whether it was product decisions or some savings: hey, we didn't run with this campaign, we favored this one, because we got a lot of external feedback. And it takes some time to build to what we sometimes call the aha moment, like,
oh, that makes so much sense why we would invest up front before we invest all these resources, whether it's driving a whole new campaign or investing in a new product feature. The other thing you need to consider is that there are different types of programs you can run. So when it comes to research, we have conversations around your risk tolerance. So for instance:
I need to be 100% sure about any decision that we make; I need full confidence. And if that is the case, what you're going to want is probably a highly structured program where there are lots of checks and balances. You're going to invest in professionals to do a lot of these kinds of research activities,
versus: look, some user feedback, gathered more frequently, is better than no user feedback. So what we're going to do is set some guardrails to ensure there are quality controls, but we're going to accept that sometimes a designer is going to go test a prototype and ask a bunch of leading questions, be a little biased. You know, it's going to happen, and you kind of have to have an understanding of
where your risk tolerance is. At Microsoft: a huge, robust program, very heavily expert-driven. Ticketmaster: a little bit more loose. And in the case of User Testing, we actually err on the side of thinking it's better if more decision-makers have that direct feedback. So we're enabling our designers, our PMs, our marketers
to directly go and ask customers questions, because we think those additional touchpoints help with actually achieving this sense of customer connection over time. I'm glad you brought up risks, because that actually gets us closer to decision-making,
the subject of our conversation today: the actual point in time where some human beings, or a committee, need to make a choice about features, about product directions, etc.
What recommendations do you have, and what school of thought should we adopt, when we have a bulk of research and a decision to make? And, I imagine, let's make it harder: it's not black and white. You don't have, you know, 90% here, 10% there in the results. Yeah. Yeah. No, I appreciate that. So one, the more that you get customer feedback, the more you actually develop intuition,
which helps to inform. So you can develop intuition and make decisions based on what I'd call customer-informed intuition. And even when it's not black and white, you have to make calls, right? This is business. I've got to hit the ship button or not. I think a success criterion for me within a company is, do people even stop to think, okay,
do I have any customer input? You know, so much today is done just on the fly. So when you start hearing questions, particularly from leaders, like, how confident are you in this? To me, that's a sign of, oh, okay, we're starting to ask the right questions, and we can have a conversation like, look,
we didn't do any research, I've got to just make an intuitive call. Or, we've done the research, it was inconclusive, and so to move forward we're going to move in this direction. But I guess for me, when I look at the number of decisions that get made versus the number that are really informed by some kind of customer touchpoint, I think we've still got a long, long way to go.
We have a bunch of features that originate from people raising problems, making feature requests, certain problems being visible in sales calls, etc. So they're informed by nature. They're coming from the user. Is it enough to say that there's good confidence around those features? Or does there need to be another layer of testing on the outcome before rolling out? Which, of course, there should be. That's...
Yeah. Again, it's how much risk you want to assume. If you've got a bunch of features, where do you want to focus your energy? You can do research to help you decide that. How do you want to solve the problem? Hey, okay, when we build this feature, here's actually how it should behave; probably a good idea to get some feedback along the way on whether that matches your customers' expectations. Is this feature good enough to ship?
That's another opportunity to be like, "Oh, let's just do another quick round of feedback and see, yeah, this is delighting customers." You don't have to do any of that, but you will be more confident in the likelihood that you're going to release a feature or a program that better maps to what your customers are expecting if you take those steps along the way.
As VP of product design at User Testing, when you make decisions about new features, are you the one making those decisions or being involved? Or is there another layer? How does this decision-making at the top level work in your company? No, I never make decisions in isolation. It's too dangerous. Bureaucracy. Yeah.
Yeah, I don't know if it's that; I think it's diversity of perspectives, right? I come in really hard with the customer perspective and the experience perspective. That's my role. I acknowledge that there are others that have a purely financial perspective or business perspective. So typically, how it works here at User Testing is that I'm part of the product team.
And while I represent the customer, it is a shared responsibility: what do we need to build, both for our customers and for the business? We talk about those tradeoffs. And then when we build them, how do we execute against that? What's the quality? What do we agree is the quality level for releasing these features?
We also have a function that is fairly new at User Testing, which is product strategy, which is really exciting because they're actually taking a much longer-term look, over like a three-to-five-year time frame,
looking at emerging trends, looking at different markets and opportunities, and driving that with a ton of research as well. And I feel pretty excited about seeing that come to fruition as well. For a small company, again, what I consider a small company, I feel like that's a really awesome investment. I'd love to hear a bit more about what specific activities
that research entails. How do you think about trends?
How do you think about markets? Because it's not like you go read HubSpot surveys on the topic. That's not how you learn. So what do professionals do when they're given a full-time task to, essentially, predict the future? So again, we're starting small and we're focusing on the questions that will have the most impact. And what that is right now is: over the next three to five years, what
capabilities and what markets are going to make us most successful? So right now, they've got a dedicated researcher. Well, probably two researchers; I'd call one a data analyst.
And a head of the program. What they're out there doing is looking at market research, doing their own primary research to understand different kinds of markets, running surveys, looking at the business information and the trends coming in. And they're basically trying to triangulate, to come back and present a picture of:
look, if we invest in this area, we can expect this type of return. If we invest in this, maybe we want to do a multi-pronged approach. And what I love is, you know, this whole function actually came from the design side, right,
where we were trying to elevate the impact that design has by triggering these kinds of design strategy discussions, blending design with the business. And it was so popular that now it's been pulled out and rebranded as product strategy. And I'm proud to say they're already coming back with some really impactful decisions that allow us, again, to have more confidence
in whatever direction we choose. And for me, I feel really proud that we've been able to kind of grow what comes from design and research into actually moving up the ladder to make big decisions for the company overall.
I have two remaining questions. One of them is: you mentioned user-informed intuition, which helps you make some decisions by sort of impersonating the user in your head. How does that relate to the mantra that you are not your customer, which everybody should keep in mind? How do you balance the feeling that you know the user with that reality check?
Yeah. I mean, what is your intuition based off of? Is it based off of me just using my product when I'm at work and thinking I'm my user? Is it my close group of professionals, who also happen to be just like me and agree with me, or my wife? So what is your intuition based on? If it's a very narrow set of experiences, not a very diverse
set of perspectives that has informed your view, that's probably not super safe. However, if you've been in an industry for a long time, if you've gone out and had conversations with customers, a wide, diverse set of customers, if you've seen research coming back and you're looking at that, over time you develop a sense
of what's going to work and a more accurate idea of what your customers are really like. The danger is that things can change. Attitudes can change. Behaviors can change. So if you don't have a constant feed of revised information to inform that intuition, then you could be lagging behind and making decisions that no longer match your market or your customers.
Fabulous answer. Thank you very much. And one more question. How do you make decisions about aesthetic choices, when it's not just about pure functionality: the new logo, or the new branding, or what color the button should be? And whether you can afford yourself a black button, even though every usability test in the world says the button should be green, you know, these kinds of things. Totally. Yeah.
There's a whole bunch of preference testing that you can do. First-impression testing, brand testing. That's like, hey, how do you feel based on this ad that you saw for five seconds? Tell me, what did you see? How did it make you feel? It's not all about usability and logic; you test for feelings too. There's a whole array of ways that you can inform
your creative work. Is the message coming through? You can have specific brand attributes,
funny, approachable, exciting, and you can test a particular creative or a website or whatever against them, to know whether you're hitting the mark. And if you're doing multiple options, which ones get closer to the attributes that you're wanting to communicate. So yeah, that's a whole other space that gets really exciting.
I think that's the perfect segue into asking you where people can learn more about this, about User Testing, and about yourself personally. For User Testing, that is usertesting.com. Make it really simple. And for me, the best place, to be honest, is LinkedIn. That is my social platform of choice, and you can keep up with my latest talks or podcasts,
posts or thoughts, and I welcome connecting with folks that are in this industry. These are my people, and so I welcome the conversation. Well, thank you so much, Jason, for joining us today and for sharing your wisdom. We wish you great growth and further culture development at User Testing. Sounds like you're doing great. Thank you so much, Jane. This was really great.
Thank you and have a wonderful rest of your week. You too.