Hello friends, and welcome back to another exciting episode of The Future of UX. I'm your host, Patricia Reiners, bringing you the latest insights and trends in the world of user experience design, technology, AI, and beyond. And today we are diving into the highlights of Apple's Worldwide Developers Conference, where groundbreaking software updates took center stage. They discussed a lot of interesting updates, from AI advancements to personalization features and a complete overhaul of Siri. There's so much to unpack. And we will explore what these changes mean for UX designers, so for us, and how they could shape the future of our digital interactions. So stay tuned for an episode packed with innovation, inspiration, and a glimpse into the future of design. Let's get started.
So the WWDC, Apple's Worldwide Developers Conference, took place at the beginning of this week, on Monday. They always share a keynote where they present the newest updates, sometimes also new hardware. This time they presented only software. So only new software was introduced, which was still super interesting, because a lot was expected. AI is currently everywhere, large language models are everywhere, and everyone is using them. And when you compare ChatGPT or Google Gemini to Siri, it gets really obvious how far Siri lags behind. No one really uses Siri, except maybe for a timer or for adding something to your calendar.
But Siri is not really working, and I think Siri needed an update. We're also seeing more and more AI-first products, which we have often discussed on this podcast, like the AI Pin or the Rabbit R1, if you remember. They focused on AI for interaction: no apps, just AI. They're not related to Apple at all, but they're showing a super interesting direction where things might be headed. The AI Pin works with a large language model, and the Rabbit R1, you remember, the tiny orange device that basically looked like a mini Game Boy, used what they called a large action model.
And this was probably mainly marketing, but the idea behind it is pretty brilliant. Instead of just generating text, large action models can also perform actions directly. ChatGPT unfortunately can't do that, because ChatGPT doesn't have access to our apps. It has access through the GPTs that we prepare, or maybe certain automations that we integrate with Zapier, but it can't really book an Uber for us or play music on Spotify, right? Because it's just not connected, and it probably will never be that way. So there have been a lot of rumors about significant Siri updates. Not that Siri would suddenly be on the level of the R1 or these devices with their AI-first approach, but that there would be some real updates. And this was probably also the reason
why the AI products that I just talked about, the AI Pin and the Rabbit R1, were launched prematurely and underbaked. They were basically just thrown on the market, although they weren't ready, which unfortunately led to their failure. Both would have needed much more time. But they knew that the Worldwide Developers Conference was coming up, that Apple would present changes and AI features, and that their AI products probably wouldn't be needed anymore after that.
So as I said, there were some rumors about major AI updates, and I must say I was personally super excited. I watched the entire keynote: got some popcorn, got comfortable on my balcony, and just watched it. By the way, you can re-watch the whole keynote if you want. I'm going to link it in the description box, in case you want to re-watch it or watch it for the first time.
And now I want to go through the updates and also talk a little bit about what's important for us as designers. What do we need to know? What updates were actually introduced? First of all, as I already mentioned: no new hardware. Unlike last year, where we saw new hardware like the Apple Vision Pro headset and new Macs, this year's event was all about software. But there is speculation about future updates for
peripherals and possibly a new Apple TV model.
They also introduced the new macOS update, named Sequoia, which includes many of the same AI enhancements as iOS 18. You will see improvements in Apple Music, in Notes, and in other productivity apps. Additionally, Apple has reorganized the System Settings to make them more user-friendly. There are a lot of updates across the different systems they're using, so tvOS 18 and HomePod Software 18, and they also introduced visionOS 2 for the Apple Vision Pro headset, which got some new features too, although specifics are still emerging. So let's see how this looks in the end. One of the long-awaited updates is the introduction of the Calculator app for the iPad. Yeah, so...
I didn't follow this whole discussion, but I heard that the internet has been full of people complaining that you even have the Calculator app on the Apple Vision Pro, but not on the iPad. So now you have it on the iPad, and the app will integrate advanced features like unit conversions and a history view, similar to its iPhone counterpart, but tailored for the iPad's larger screen.
And one feature that I find particularly interesting is the Math Notes section, where you can sketch and make handwritten notes of a calculation, and it then automatically solves the calculation for you. If you make certain changes, also to sketches that you might draw alongside the calculations, the result changes as well. So pretty interesting.
The next thing that I found super interesting was the huge focus on personalization. iOS 18 will allow users to place app icons and widgets basically anywhere on the home screen, breaking away from the traditional grid layout. And users will be able to recolor app icons,
which means the following: right now, app icons each have their own distinct color. The Apple TV app, for example, is purple, and then you have the Photos app with its rainbow colors. They all have their specific color. With the update, you will be able to tint them all red, for example, which I think, from a design perspective, is about the worst thing that you can do. So I think this decision is somewhat questionable, because suddenly all app icons look extremely similar, which makes it so hard to find an app, right? They have the same size, they have the same color; each still has a different glyph on it, but they all look very similar.
But in the end, I'm pretty sure they had their reasons. Maybe they recognized in research that people don't really use their home screen that much, but go through search all the time, right? So if you're looking for the Instagram app, for example, you basically just swipe down, the search bar appears, you type in "Insta", and the Instagram app appears below. So maybe this is something they discovered in research, and the reason why they think that personalization is more important than discoverability. Another interesting thing about personalization: you will be able to create custom emojis based on a description, thanks to an AI integration. So at the moment you have certain emojis that you can use, for example
a bagel, a croissant, a heart emoji, you know, all the emojis that we know. But with this new update, you will be able to create your own. For example, I would create an emoji of a dog wearing glasses, writing something on a Post-it, which would be my own dog Wilma doing, you know, Post-it work, pretending to be a UX designer dog. And then I could just create this emoji myself and share it with my friends on WhatsApp or iMessage.
Before we get into the AI topic and the Siri updates, I just want to say a few words about personalization. You can see how strong the focus on personalization has become: making my Apple device truly mine, moving away from standard branding towards personalized design and experiences. So you can now create custom emojis, like the dog with the Post-its and glasses,
which I think is definitely super interesting and exciting, integrating the user more as a creator. And we will see this in interfaces sooner or later as well. Personalization will adapt colors, information, and design to each particular user, because AI learns the patterns, learns about the users, and then adjusts the personalized interface. This is only possible because of AI, and it will also impact us UX and UI designers in the upcoming years. So currently it is still hard to imagine, but this is what's about to come. By the way, if you like the podcast, I would really, really appreciate a five-star rating. It helps me continue producing the podcast, researching exciting topics, and inviting amazing guests. Ratings are super quick, you know, it just takes a couple of seconds.
I also used to not rate the podcasts that I love and listen to. But now I said: no, for all the podcasts that I love, I want to support the creators. It's so easy to do. Just give a five-star rating. There's always so much love and so much work in creating a podcast or a free resource, and as a listener you can help the creator so easily, just by rating the podcast or even writing a review. I really love to read the reviews, and I would be so, so grateful. Thank you so much for your support. Now we are moving into the Siri updates and into actions.
So let's switch to the long-awaited Siri updates. Siri was practically useless before. I don't know when the last big update was, but I only used it for setting calendar entries, timers, or alarms. Now Siri has received a complete upgrade. First of all,
Siri got actions across apps. So Siri can take actions within and across apps. For instance, you can send an article from Apple News to a group thread in Messages. Third-party apps will also benefit from Siri's new capabilities. So you can not only talk with Siri; Siri is able to actually perform actions in your apps.
So a little bit like what we have seen with the Rabbit R1 and what they call the large action model. The good thing with Siri is that it really leverages broader personal context, allowing it to search through your entire device. You can ask it to show me the things I sent to Todd last week, or to go through my photos and find a picture of my passport. And it will find the photo, send an email to Todd with the picture of the passport, and notify me when he responds.
And you don't need to do that via voice commands. You can also use text-based interactions, like with ChatGPT, for example. So you can interact with Siri via text, correct your statements in real time, and enjoy a new and more integrated look within the operating system. And the super interesting thing is, and I talked about it, I think, two episodes ago, where I dove a little bit into the rumors about the new upgrades:
ChatGPT has been integrated. So Apple announced that Siri can tap into OpenAI's ChatGPT when needed. And they actually solved this in a super interesting way. ChatGPT is not a permanent part of Siri or of the process. But if there is a question that Siri can't answer, there's an option to hand it over to ChatGPT. And then you are asked: should I share this conversation with ChatGPT? Should I share this image? So ChatGPT doesn't have any access to the content on your phone by default.
Siri only shares, with your consent, the information that you allow it to. And I think this is super important, also from a privacy perspective. ChatGPT won't be the only large language model that gets integrated; there will probably be others as well, but they haven't named any specifics yet.
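Just to make that consent flow a bit more tangible, here is a tiny sketch in Swift. To be clear, this is not Apple's actual API; all the names here are made up for illustration. It only shows the logic of the pattern: ask first, and share nothing by default.

```swift
import Foundation

// A toy model of the consent-gated handoff described above.
// Every name here is hypothetical; this is not Apple's real API.
enum ConsentDecision {
    case allowOnce
    case deny
}

struct HandoffRequest {
    let question: String
    let attachments: [String] // e.g. an image the user was looking at
}

func answerWithExternalModel(
    _ request: HandoffRequest,
    askUser: (String) -> ConsentDecision,
    sendToChatGPT: (HandoffRequest) -> String
) -> String? {
    // Ask before every single handoff; nothing leaves the device by default.
    switch askUser("Should I share this with ChatGPT?") {
    case .allowOnce:
        return sendToChatGPT(request) // shared only after explicit consent
    case .deny:
        return nil // the content stays on the device
    }
}
```

The interesting design choice is that consent sits between the assistant and the external model on every request, instead of being a one-time setting buried somewhere in preferences.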
But what does this mean for UX? This is the most important question. Siri's ability to take actions autonomously might remind some of you of the Rabbit R1 and the large action model. And from a user experience perspective, this is interesting. Instead of users having to go through every step from problem to solution, like opening an app, finding the right information, clicking on a button, sharing it, they simply make a request and Siri handles the rest across various apps. This represents a whole new way of interaction.
Additionally, apps can now have predefined actions for Siri. As UX designers, this is something we need to keep on our radar when developing apps; I'll show a small sketch of what that can look like in a moment. By the way, Jakob Nielsen wrote a few interesting articles about the shift from command-based to intent-based outcome specification, so a little bit of what large action models, or these actions, are doing. I linked the article in the description box, so if you want to read it, please check it out.
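And here is that sketch. Apple's existing App Intents framework is how apps already declare actions to the system, so I'm assuming the new Siri capabilities build on something like this; the intent itself and its parameters are hypothetical, just to show the pattern of declaring an action instead of designing a screen flow.

```swift
import AppIntents

// A hypothetical "share article" action, declared with Apple's
// App Intents framework. The system (Siri, Shortcuts, Spotlight)
// can discover and run intents like this, so users can trigger
// the action with a request instead of tapping through the UI.
struct ShareArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Article"

    @Parameter(title: "Article Title")
    var articleTitle: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, this is where you would call your own
        // sharing or messaging code.
        return .result(dialog: "Sent \(articleTitle) to \(recipient).")
    }
}
```

Notice the shift for us as designers: we describe what the action is and which parameters it needs, and the system decides when and how to surface it. That is intent-based outcome specification in practice.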
So thank you so much for listening. I hope you enjoyed this podcast episode. Feel free to say hi on Instagram at ux.patricia or on LinkedIn, I'm always super happy to connect. Don't forget to rate the podcast, that would be so helpful for me. And then: hear you in the future.