
#76 How Mental Models Shape AI Design

2024/7/25

Future of UX | Your Design, Tech and User Experience Podcast | AI Design

Chapters

This chapter introduces mental models and their significance in designing user-friendly AI applications, emphasizing how understanding these models can lead to intuitive and satisfying user experiences.

Shownotes Transcript

Hello friends and welcome back to the Future of UX. My name is Patricia Reiners, I'm your host for this podcast and I'm a UX and Innovation Designer from Zurich, Switzerland. So nice that you are here. In this podcast episode we will dive into the whole topic of mental models and why it's so important

to understand mental models as a designer when you are creating AI-driven products. We will uncover why understanding mental models is crucial for making user-friendly AI applications that not only meet but exceed user expectations. So it will be a super, super interesting deep dive where we focus on AI and how to design AI products.

So we're going to talk a lot about mental models. First, let's clarify what mental models are. Mental models shape the way we interact with technology. Simply put, they are the way we think things work. They help us predict how things will behave and how we should interact with them. So what exactly are mental models? Mental models are our brain's way of making sense of the world.

They are the beliefs, the ideas and the assumptions we have about how something works based on our past experiences. So, when we encounter something new, we use these models to understand and navigate it. Example: Imagine you're trying to open a door. You probably have a mental model that tells you if there is a handle,

you need to pull or push, and if there's a knob, you turn it. These assumptions help you interact with the door without having to think too much about it. So how do mental models affect our interaction with technology? When it comes to technology, mental models play a huge role in how we use devices and applications.

If a new app or gadget works in a way that's similar to something we already know, we find it easier to use. But if it's too different from our existing mental models, or from things that we experienced in the past, we might struggle with it because we don't really know what to do. We don't have a mental model that we can use for the interaction. So when you think about your smartphone, you probably have a mental model for how apps open just by tapping on them,

and how to navigate just by swiping left or right. And this model is based on your experiences with similar devices and apps. And if a new app follows these familiar patterns, you will likely find it easy to use. But if it introduces completely new gestures and interactions, you might find it very confusing at first. So,

Why are these mental models important for AI design? For designers who are working on AI products, understanding mental models is crucial, because when we're creating AI-driven products, we need to consider the users' existing mental models to make the technology intuitive and easy to use. So

if an AI product aligns well with what users already know, it can make the experience smoother and more satisfying.

Think about voice assistants like Siri or Alexa: they are designed to fit into our mental model of having a conversation with a friend. So we ask them questions or give commands in natural language, just like we would with another person. And this makes the interaction feel more intuitive and less like using a complex piece of technology, right? So mental models are essential for how we understand and interact with the world around us, including technology.

And by designing AI products that align with users' mental models, we as designers can create more intuitive and user-friendly experiences. So next time you look at a new gadget or app or even an AI product, you can think about the mental models behind it.

Before we get started with the topic, I have a little mini announcement: AI for Designers is going into the next round.

If you haven't heard of AI for Designers, it's a six-week intensive program, an AI bootcamp for designers where you learn step by step, week by week, how to use AI tools as a designer to be more productive and to really get up to speed with the tools that are available right now. But you're also going to learn how to design AI products, how that works, and how to get started.

So both sides of AI.

Week by week, you have a different focus. You have tasks that you get feedback on. You learn the tools. You learn the methodologies. You get a lot of hands-on experience. And we have, I think, over seven live calls where we work together in the group, where you get exercises and a lot of input from me, and where we go through the different workflows. So it's an amazing course. People loved it so much. You can also find more information in the description box.

My recommendation is to sign up for the waiting list. Then you will get notified when the doors open, and you also get bonuses that you don't get if you only sign up once the doors are open. These bonuses are only available if you're on the waiting list. So my recommendation is: sign up.

You get notified, and then you can decide if you have time and if this is a good fit for you in that moment. But sign up. You can find all the information in the description box underneath this podcast. I'm super excited to hopefully see you in AI for Designers soon. But now we are diving into the exciting topic of mental models in AI. First of all,

there's a huge discussion about how to design AI products, and I get this question all the time. Some people think it's basically a chatbot like ChatGPT, where the user can input something and then gets a response. But this is actually not the case, and it's also not how the future of AI products will look. Yes, a chat interface where you can chat with an AI assistant might be a part of the product.

But this definitely, and I promise you that, won't be everything. People are very graphical, very visual beings, right? People need to see things. They don't always want to type. So

what definitely won't happen is that we have chat-based interfaces for every service or every app. It will be a part of it, yes, for sure, but not everything. We will definitely still have graphical user interfaces in the future. So how do we design those with AI? And here we come to the very first and super important point: if you want to design great AI products, the most important thing is still design.

The first thing: understanding and predicting user needs. The interesting thing with AI is that the product is suddenly capable of learning and understanding human behavior and then predicting what users will need. Let's start with a simple idea. Imagine an AI that truly gets you. It listens to your preferences, knows what you like and even predicts what you will need next.

And this isn't just science fiction, it's the magic of aligning AI with users' mental models. For example, think about Netflix or Amazon. These platforms use your past interactions, the series or movies you watched, to recommend shows or, in the case of Amazon, products, making your experience feel more personalized and spot-on.

And this kind of alignment makes users feel understood and builds trust in the technology. So think about one user, a person who really loves mystery novels. Over time, her e-book app learns this and starts suggesting new releases in the mystery genre.

So this user feels like the app really knows her, and she ends up spending more time and money on it because it's constantly delivering what she wants. And this is pretty interesting, right? Because all of this is still based on understanding and predicting user needs, something that we are already very familiar with in UX design, right? So if we are designing with AI or for AI products,

it's still very similar. We are still focusing on understanding and predicting the user's needs. What's newly possible with AI is personalizing content for the user and predicting what the user wants. This means that not every interface needs to look the same; we can predict what kind of content is important for the user and when. This is the first important thing.
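
Purely as an illustration of that idea, here is a minimal sketch in Python. Everything in it, the book records, the `favourite_genres` helper and the ranking rule, is invented for this example; real platforms like Netflix or Amazon use far more sophisticated models, but the mental model on the user's side is the same: the app learns what I like and shows me more of it.

```python
# A deliberately tiny sketch, not a production recommender:
# an e-book app that learns a reader's favourite genres from her
# reading history and ranks new releases accordingly.
from collections import Counter

def favourite_genres(reading_history, top_n=2):
    """Return the genres the reader actually finished most often."""
    counts = Counter(book["genre"] for book in reading_history if book["finished"])
    return [genre for genre, _ in counts.most_common(top_n)]

def recommend(new_releases, reading_history, limit=3):
    """Surface new releases in the reader's preferred genres first."""
    preferred = favourite_genres(reading_history)
    ranked = sorted(
        new_releases,
        key=lambda book: (book["genre"] not in preferred, book["title"]),
    )
    return [book["title"] for book in ranked[:limit]]

history = [
    {"title": "The Quiet Harbour", "genre": "mystery", "finished": True},
    {"title": "Midnight Ledger", "genre": "mystery", "finished": True},
    {"title": "Space Gardens", "genre": "sci-fi", "finished": False},
]
releases = [
    {"title": "Chrome Hearts", "genre": "sci-fi"},
    {"title": "A Study in Fog", "genre": "mystery"},
    {"title": "The Last Witness", "genre": "mystery"},
]

print(recommend(releases, history))
# ['A Study in Fog', 'The Last Witness', 'Chrome Hearts'] -> mystery first
```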

The second is that AI can redefine user interactions. It isn't just about reinforcing what we already know; it can also introduce us to new ways of interacting with technology. I already mentioned chatbots, and there are also voice assistants like Siri or Alexa. These devices challenge the way we interact with our devices, moving from touchscreens to voice commands.

And this shift might seem strange at first. I remember when these technologies were introduced. But it opens up new possibilities for how we engage with technology. And embracing these new interaction styles can help us grow and adapt, making our technology experience even richer. I think...

when we are thinking about the future, we can't really get our heads around what will be possible in the near future. I already touched on it in the first part: personalization will be a huge topic, because this is suddenly possible. Personalized content for the user. Personalized interfaces.

This is super interesting from a UX perspective, right? Because we still need some kind of consistency. We can't just rearrange things completely; there still needs to be some kind of consistency, for the user's sake, right? It would be very difficult if the user opened the app, saw something completely different each time, and didn't really know what to expect or where to find the content. So we need personalization, and this is a huge game changer, right?

But we also need consistency. Maybe some of you remember when smartphones first came out. Many people were very skeptical about touchscreens. They preferred physical buttons, right? Because you could feel them; they were easier to press. But now touchscreens are the norm. Voice assistants are on a similar path.

People use Alexa to manage grocery lists or a schedule, or they use the ChatGPT app with its voice integration. Sometimes it takes people a while to get used to it, but now we can't imagine life without it. So what's the role of user experience design in all of this, when we are thinking about how to design AI products and the mental models behind them?

So user experience design, or user-centered design, is all about creating products that fit seamlessly into our lives. And AI can enhance this by offering personalized experiences. I'm talking a lot about it, but I think this is absolutely fascinating from a UX perspective. And these experiences adapt to the individual user.

For example, educational platforms powered by AI can adjust the difficulty of content based on a learner's progress, right? So it depends very much on how fast the person learns: what kind of content the person sees, and what the next step is that the user needs to take in the software, in the app, in the program.

And this really ensures that they are always challenged but not overwhelmed, so you can personalize the whole experience. This personalized approach supports existing mental models while helping users develop new, more efficient ways of thinking.
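
As a rough sketch of what that kind of progress-based adjustment could look like, here is a small Python example. The scoring scale, the thresholds and the level range are all made up for this illustration; a real learning platform would use a much richer learner model, but the principle of keeping the learner challenged without overwhelming them is the same.

```python
# A minimal sketch (invented thresholds, not a real product):
# nudge lesson difficulty up or down based on a learner's recent
# quiz scores, so the content stays challenging but not overwhelming.
def next_difficulty(current_level, recent_scores,
                    raise_at=0.85, lower_at=0.5,
                    min_level=1, max_level=10):
    """Scores are fractions between 0.0 and 1.0; the level is an integer."""
    if not recent_scores:
        return current_level                 # nothing learned yet, keep it stable
    average = sum(recent_scores) / len(recent_scores)
    if average >= raise_at:                  # consistently strong -> more challenge
        return min(current_level + 1, max_level)
    if average <= lower_at:                  # struggling -> ease off
        return max(current_level - 1, min_level)
    return current_level                     # in the sweet spot -> no change

print(next_difficulty(3, [0.9, 0.95, 0.88]))  # -> 4
print(next_difficulty(3, [0.4, 0.35, 0.5]))   # -> 2
```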

So what do we learn from this when we want to design AI products? User-centered design, combined with AI and mental models, creates a dynamic environment where technology not only adapts to us but also helps us grow. And by understanding and leveraging these mental models, AI designers can create intuitive systems that improve user interaction and satisfaction.