Welcome back to another episode of The MacRumors Show. Good afternoon, Hartley. Big news, big news. I was not expecting this.
At least, I wasn't.
I don't know when I was expecting this, to be honest. iOS 18.2 dropped in beta form, so let's just preface all of our discussion right now by saying this is a beta, and that should speak for itself.
But iOS 18.2 dropped, and what's significant about that is that next week iOS 18.1 goes out to the public, and that'll give you the initial Apple Intelligence features that I've been using in beta form for, like, two months now. A month, a while, a good amount of time. But in 18.2, what we got is the rest of the features, or pretty darn close to everything else that was still missing. We got Image Playground, Genmoji, Image Wand, ChatGPT integration, and Visual Intelligence if you have an iPhone 16 or 16 Pro. What else is missing? How much of the Siri contextual awareness is there? Is that the biggest thing?
Well, I would argue that we're still missing the big things. I think we got big things, but I think we're missing the things that actually make Apple Intelligence interesting, which is the personalized contextual awareness that you are supposed to get with Siri, and on-screen awareness as well, and the ability to tie this stuff together.
Because right now, what we effectively have is all these little tools. So we've got writing tools and, yeah, Genmoji, all these little features that we've heard about for a while, Visual Intelligence being quite an interesting one that I've been trying out over the past twelve hours also. So we're getting there, but there are still probably two more updates to come, iOS 18.3 and iOS 18.4, that should fill out the rest of Apple Intelligence in terms of what was originally unveiled at WWDC this year.
So yeah, I mentioned the Siri stuff probably because, I'll be honest, I didn't even know how to check for that yesterday, because I'm like, well, what do I ask it to make it, like, contextual? I asked it when my next dentist appointment was. I would never ask that, for some reason, but that's a very common thing, and it brought it up.
But then I was like, ah, it could probably do that already, because it's just reading your calendar, right? So then you have to think about what would be good things to test. Like, the contextual example in the keynote was, when is my mom's flight coming in, or something like that, right? And then there would be follow-up questions from that.
I've got to think of something so that when it does come, or whenever a new beta drops, I can immediately try to test this out, because we don't usually get a list of things that are dropping in these betas. We kind of just have to go and find them for ourselves, which is fun and infuriating. I don't know.
Have you had a chance to test it out? Oh, you can't, right, because it's US only. Or can you try some of these?
Well, there are two answers to that question. Previously, yes, I could, because I just set my device to US English. But now, as of iOS 18.2, I've actually switched my device back to British English, because most English-speaking countries got their variants of English added with iOS 18.2.
So Australian English, Canadian English, and British English are all now enabled. People in those countries will not need to change their device language by Christmas. Whereas on Monday, if you're in the UK
or Canada, wherever it may be outside the US, you will need to actually change your device language to US English to be able to explore Apple Intelligence.
That's good to know. See, as someone who is in the US, we often don't think about those things, because it just came and dropped for us. So yeah, that's good insight for those of you abroad who would like to try it out. So what are your thoughts on 18.2, then? What's your favorite thing so far that you've messed around with?
Well, I've had a chance to check out almost everything. The only things that I haven't got are Image Playground, Image Wand, and Genmoji, because it hasn't granted me access yet. I've joined the waitlist.
So that's another thing. Yeah, it has
not given me admission yet. I don't know, but because I moved back to British English, I have a slight suspicion that it is doing it for that reason, because Tim, someone on our team who is also British and presumably has his device set to British English, also doesn't have it. But it seems like all the US MacRumors people do have it. So I wonder if that has something to do with it. We're being treated as second class.
Did you do it right... no, stop it. Did you... did you do it right away? Like, request it right away, in the hour of the update, on the day of the update?
Okay, yes. Yeah. Because I got granted access within, like, five minutes of requesting it. I got a notification that was like... because, I thought, you know, let's peel back the curtain a little bit for those of you out there who think making YouTube videos is easy. It can be at times, especially the way I've got things set up. I've kind of got things in, like, a system.
But yesterday was iPad mini release day, and I did not get an iPad mini early, so I had to go pick it up and then make a video on my first impressions while the news was fresh on release day. I started at 10 a.m. and magically somehow got that done around one, and then got the notification that, oh, hey, by the way, 18.2 just dropped, you're going to do another video based on the new Apple Intelligence features.
So I did two videos, and I think I posted that one right away. I was done by three. I think we might have posted them overlapping by accident.
So two videos in a span of five hours, six hours, is tough, mentally and physically draining. So I was really happy that they granted me access right away, because I thought I was going to be waiting for a while, and so I was starting to film some of the other stuff.
And then the notification popped up, and I'm like, alright, thank you. I can keep going down the list of things. I do not enjoy the Image Playground images of myself that it made, and it's mostly because I think they look bad.
But in the sense of, like, that's the way I look, because they look like me. It's very disappointing to see that in a caricature, like, damn, I really do kind of look like that.
It does a really good job. I think they look really nice. But I'm like, man, is that really how I look? Damn, that's tough.
Ah, how did your images... did you... oh, you couldn't do it, sorry. I'm going to do that a couple of times, my bad. I cannot wait to see what your images do look like. Hey guys, just want to take a quick break to let you know that this episode of The MacRumors Show is sponsored by Turtle Beach.
It's been a minute since I've done some serious gaming, but with the new Call of Duty coming out, I had to brush up on my skills and get the old squad back together for old times' sake. But one of the things I noticed was just how bad my headphones were. I felt like I could barely hear anyone coming up behind me.
I couldn't really hear my friends trying to tell me what to do or if something was near me. So I finally had enough and decided to give the Stealth 700 headset from Turtle Beach a try. Their new Stealth 700 picks up so much crystal-clear audio that I feel like I'm basically cheating.
I could hear flies buzzing around me that I honestly thought were actually in my house and not just part of the game. You'll hear every footstep, every reload, every enemy trying to sneak up on you, and you're able to take them out because you know that they're coming. Not to mention that these are just so incredibly comfortable that I started to wear them for other things, like listening to music too.
I can wear them for hours while gaming, hours while listening to music, whatever the case might be, and my ears were never fatigued, and the battery lasts a long time. We all have that guy on our team whose headset just dies at the absolute worst times.
Well, that was me. I was that guy, and now I have an insane 80 hours of battery life to work with, so I'm no longer that guy. Get the ultimate immersive gaming experience with Turtle Beach today. For a limited time only, head to turtlebeach.com and use code MAC, M-A-C, for ten percent off your entire order.
That's ten percent off your entire order at turtlebeach.com with promo code MAC. That's M-A-C, MAC. Thanks, Turtle Beach, for sponsoring this episode.
So I guess I'll walk everybody through, real quickly, just a high-level, you know, insight into how it works. It works really well, first off. My phone was getting kind of hot.
I do want to stress it could be just because, you know, new update, and immediately following that it's just a little bit warm as it's doing a bunch of things. But I kind of noticed last night when I was messing around, again, it does get a little hot when you're starting to generate images and Genmoji. Specifically, those two tasks are when I noticed it the most. Actually, at one point, and again, beta,
so we have to keep that in mind, but at one point my phone just completely restarted. Not like the springboard restart, where it's a little thing and then it goes back to the lock screen. It fully turned off, fully restarted. It just had enough.
I was trying to generate Genmoji of every single person in my family, which, by the way, those Genmoji look, like, terrifyingly realistic for my kids. It was actually kind of fun doing all of that, and it's pretty cool. So you go into either the Image Playground app, or you can do it inside of Messages, just kind of hit that plus button.
Then you'll see it in the list of options for Genmoji. It's in the emoji section, top right corner of it. And you can just start typing in prompts.
You can use the suggestions that Apple provides if you don't know what you want to generate, and then within, like, a few seconds, it will give you a few different options, and you can thumb them down if you don't like them or if you want a new set, and that's it. And that just gets implemented into whatever, you know, the Notes app or the message, whatever you're using it in. And like I said, it's pretty seamless, and I think some of the things that you can do are pretty good. Do you have any questions, Hartley, on your end about it, since you couldn't
mess around with it? I suppose what I'm intrigued by is just how much they are of good quality, because you're presented with multiple outcomes. Do you find that they are pretty consistent, or are there lots of sort of stray artifacts in some of them, where you have to be aware that, okay, this one has not been generated too well?
No, yeah, you're right, I do. So when I say it works pretty well, it's more of, like, you know, how well the actual process goes, like, of creating one.
But some of the prompts, depending on what you type in, that's where it's going to need some more work. So I did one, for example, of, like, two thumbs down. Like, I wanted it to be just basically two emoji, one of, like, two thumbs down, and some sort of different variations of that.
And it didn't do it. It just gave me the normal thumbs up emoji. And then the other option was a thumbs down, and that was really it.
There were some random weird things that were happening. And then, oh my god, I did it with me, like, I typed Dan, and then it kind of highlights it, and you'll see it's a person.
And then, by the way, kind of peeling this back a little bit even more, when you tap the person, you can pick different images for the Genmoji and for the Image Playground references to be used, so you can go to your photo library.
So it's not using the Photos app's understanding of people? It is,
but it gives you different pictures of, like, it pulls from the Photos app, right?
Does that make sense? Yes, that makes sense. That's good, because I was concerned that I would have to actually go into Photos and choose what I wanted it to use.
No, no. And then it even pulls up a list of people. So for me, it pulled up everybody in my family immediately, right there, and then it pulls from a bunch of different pictures of that person. And so you can go ahead and adjust that, or you can go find one specifically in the Photos app that you might want that it didn't already auto-populate with.
So then, once you do that, you know, I typed in Dan with two thumbs down, and it did a pretty good job there. There were a couple of bad ones, but eventually, in that set that it gives you, like three or four, there was one that had me giving you the thumbs down. And yeah, I mean, it's just kind of hit or miss with the prompts. But overall, as a whole, especially in the first beta of that, I think it's working pretty well. Do I stand by my thing last week where I said this was going to be, like, a major feature that everybody is going to use?
I don't know. I'm not saying no, and I'm not saying... okay, I still stand behind it one hundred percent. I can see kids and younger people still thinking that this is, like, really cool.
I had a lot of fun with it. Am I going to use it all the time? I don't know. It was fun and fresh and exciting because it just happened to be new at the time. So talk to me next week and ask me how many Genmoji I've created.
I guess, once you've created one or two that you may want for specific instances, you may continue to use them. But I think the chances of actually using Genmoji, and of creating a new Genmoji on even a weekly basis, are probably quite unlikely.
Yeah, I mean, you're right. I don't think you're going to be making a whole bunch of them on a daily or weekly basis. But the good news is, when you find ones that you do like, they automatically get added to your library, so you can just go and choose from those.
And also, I did some research, AKA I sent some to a person that I know who has an Android phone, and I asked them to send me a screenshot of what this looks like back to me, because I was very curious what Image Playground images and Genmoji look like, and they look exactly the same. So kudos to Apple for making that work across different platforms.
I don't know if that was maybe because of RCS doing some of the work, because this person did have RCS on. So I guess I can try again with a non-RCS conversation, but at least over RCS it looked exactly the same.
And the guy also said that it was terrifying, the images that I sent him of myself driving a car and riding a skateboard, which I don't disagree with him on. It was pretty uncanny. So yeah, that's the generative image portion of iOS 18.2.
I think it's just... it's not the part of the update that intrigues me. And I don't want to be too much of a doubter on the rest of it, because I know
people have been calling you out for being a downer lately.
I know. I can't help it, and this is not going to help, because I have not been that impressed with the ChatGPT integration, okay?
So that's something that you have tried. So, enough talking from my end. Let's hear you, people want to hear you more anyway. Let's hear what you've got to say about those things.
So now ChatGPT is integrated directly into Siri. Initially, you will be prompted when a request is going to get fed to ChatGPT, but you can, in Settings, turn this off so that you don't get prompted every time, which I have done. You can also sign into a ChatGPT account if you have one, and this means that you can use the more advanced models more regularly. And this all sounds good, and it's better that it's there, definitely. But I think what has sort of surprised me about this is that it's just easier to use ChatGPT itself than to go through Siri. Well, yeah, I mean, if you
know that you want to use ChatGPT in the first place, then yeah, just go right to it. But the point of it is that if you ask Siri something, you know, off the top of your head, something you're thinking of, you want it to do something and it can't, then it conveniently asks ChatGPT for you. I think that's, like, the major portion of that whole part right there. Am I off there?
Yes. I just don't know if I love the way that it's been implemented. So I don't love the way that it will provide the response and then say that it's from ChatGPT. It feels like it adds a lot of friction, and it's unable to switch out of that mode easily. So it's not conversational, because if I ask it to do something and it sends that request to ChatGPT, and then I ask it about what's coming up in my calendar today, it can't do that. It's then using ChatGPT, and ChatGPT says it doesn't have access to my calendar.
I guess you weren't going to be able to have a follow-up like that, yeah, where
it immediately gets stuck on it. Anything else, it can't do, and that's quite annoying. And I just sort of feel that it isn't particularly cohesive. I know that this is early days, because we will get there eventually when Siri has the personalized context. But so far, I have actually just found it kind of infuriating, because Siri is still so limited, incredibly limited, and it has still been unable to fulfill basic requests.
I am still having difficulty with basic HomeKit requests, and even when I was asking it to pull up my calendar for tomorrow, I had to watch that I was phrasing it in very specific ways to actually get it to do that, because with slight variations it's still not completing that basic command. And that is so frustrating, because I really hoped that it would have sorted that by now. I also would have hoped that it would begin moving information to ChatGPT as I needed it to, but it is currently not able to do that. So if I say to it that I have this appointment at this location tomorrow, from a command with my calendar, I can't then ask ChatGPT for the opening times for that location. They would have to be separate requests. And it just feels... I suppose what I'm trying to say is that it feels like ChatGPT is just so far and away better than Siri, and so Siri is just kind of annoying me when I actually go to use this feature, and I just want to use ChatGPT by itself, because Siri is just not useful.
I guess so. I guess I could understand that. I would imagine that the reason why is, you know, if you wanted to move all that personal stuff to ChatGPT, I don't think Apple is going to want to do that. That's kind of the whole point of, like, the
privacy aspect. But you should be able to do that eventually.
And is there the option, like, are you saying that you want them to do that?
That is supposed to be the key selling point with Apple: the ability to sort of pass information through these different systems, whether that be on-screen awareness, whether that be taking actions on things in apps, while also integrating with Siri itself and ChatGPT, yes, and moving things between these different areas of the device.
But it is just really, I suppose, highlighting the inadequacies of Siri to me right now, where I have been experimenting with asking it more unusual requests. It's just nowhere near as good as ChatGPT. And I would go as far as to say that even hearing ChatGPT's responses through the voice version of Siri is significantly worse than the voice version of ChatGPT. Significantly worse.
So if I want to have a conversation with ChatGPT, which very occasionally I might do if I'm trying to brainstorm something or I want to talk through an idea, I would still go to ChatGPT for that directly. And I wondered if I might use Siri, but it just feels clunky, and all the time it's telling me that the request came through ChatGPT, and it doesn't sound that great with the way that Siri's pronunciation works. It just doesn't seem to hold the thread of a conversation anywhere near as well. It's surprisingly worse than just using ChatGPT by itself, at least in my experience so far.
And I do understand this is early. But what we have to remember with this is that it's a balancing act, right? Because on the one hand, you could rightly say to me, it is going to be a while before this is finished, so give it a fair chance, understand that this is not there yet.
But on the other hand, a lot of people have bought iPhone 16s now, because Apple has heavily advertised that these are Apple Intelligence devices. They are built for Apple Intelligence. They've made the cube all lit up in the Apple Intelligence colors. Every ad you see for the iPhone 16 is so Apple Intelligence heavy, and they are rolling these features out bit by bit. So ordinary people are going to use this version of Siri with ChatGPT in 18.2.
So even though this is a beta version now, ordinary people are going to use this exact version, this exact way that it is supposed to work, as well. So I understand that there is more coming later, but it is not quite there yet. And I kind of... I kind of wish
they would
have held off until it was ready. It feels like they're just
rushing. Well, and you know, to be fair, I think the other features that will be coming out next week are pretty good, and they're ready. But the 18.2 one, that is still a beta. And how long did it take for the 18.1 features to roll out? When did that officially get released?
Was that about two months?
Yeah. So it's been a couple of months. I think this is going to be... I feel like this one would be a little longer, right? I think this might take a little bit more time.
And even if it does, that's fine and understandable. I suppose I just find it a little bit disappointing that these features and this new version of Siri are being broken up and released really slowly. I don't think that's great for the image of Apple Intelligence, because on Monday there are going to be people with iPhone 15 Pros who are going to think, oh, I've got Apple Intelligence now, I've got the new version of Siri, look at it lighting up in the new way.
And they are going to find that they actually have the exact same version of Siri they've always had. And that is misleading. And they are going to think, this isn't great. What is the fuss here?
It does light up in that cool new way, though.
Yeah, and that's pretty much it. Well, actually, the only other Siri feature that you get is that you can ask it questions about Apple devices, which is quite cool because it's leveraging all of the Apple support documents. But I just don't know if releasing things in this way, particularly with Siri, really makes a lot of sense, because it's not really clear what it can and can't do.
I mean, even you and I, at the start of this conversation, were not entirely clear on how we can test its current capabilities. And if we in the tech community are confused about that, how is an ordinary user going to feel? They are just going to feel like, so I bought this new iPhone 16 and now Siri is glowy and different, but it's the same. There needs to be, like,
a full breakdown from Apple of, like, here's everything new with Siri. I would make that video right now for everybody if I knew what was new with Siri, because, like you said, I'm only going off of the things that they showed us at the event. And I didn't even get the briefing for Siri.
I got briefings for everything else. What would have been the most useful thing would have been to hear more information about Siri, and I don't even know how many more new things they even really talked about aside from what was at the event, and that was back at WWDC. So it's been months.
Yeah. I mean, I'm remaining optimistic, because I do really like some of these features, and I do think they work well. And I think somebody is going to find these incredibly useful.
I think ChatGPT integrated into Notes, where you can type in prompts and stuff, is super useful. Having that baked into these documents or these editors, and being able to, you know, have ChatGPT at your disposal without having to go into the app or the website, that's pretty helpful.
The on-screen thing, I tested that yesterday, where I said, what am I looking at, and it went to ChatGPT and it gave me a full breakdown of what I had up, some random, you know, PopSocket. There was, like, another company that makes something similar, and it gave me, like, a full breakdown of the company and what they make. I would not... I don't care. But I mean, that could be useful for something else.
You're looking at something on screen, maybe, I don't know, an image of something, and you're like, what is that? I don't know what that car is, or something. Maybe ChatGPT can help you out. There can be some really helpful things there.
You don't have to really move, you know, your fingers, so to speak, to go to another app or service and have to Google something. So I do have one question. If Apple was so, like, in love with this idea, and everybody was moving into the AI space, and we saw companies, and Apple said, hey, we're going to do the same thing, and it's been a while... are you surprised that they just didn't buy one of these companies, like ChatGPT? They probably could have just purchased them and had full control over this, no? Or was it too far gone, because there were too many people in the public that have
had this? It was probably too far gone at that time. I mean, Apple could have made a lot of very interesting acquisitions over the years. It could have acquired Tesla years ago, very inexpensively comparatively, and then it could have operated it just like how it operates
Beats. Or it could have absorbed it and offered its own version of the Apple Car, and not had any of the issues that it subsequently encountered with that project leading up to its cancellation. So I think Apple is reluctant to acquire bigger companies, and it's pretty clear that Apple is behind with AI. I mean, we heard over the weekend that, internally, supposedly, Apple's executives see Apple as two years behind the likes of Google, OpenAI, and Meta with AI technology.
So it's not really too surprising. I just find the rollout of Apple... I think what it is, is I find all of Apple Intelligence to be underwhelming, not because I don't want these features, but because there is something about the way that this has all been marketed and rolled out that feels incredibly rushed. And actually, if we're being really honest, not particularly life-changing. So it will be... it will be, later. And, um, if it...
And it's only really Siri, really, that's going to be the thing that could potentially be life-changing. I don't think any of these other features are.
But let's say that the ability to take action across apps and understand personal context is not available until iOS 18.4. That might be April.
Maybe, yeah.
By then, we're well on the way to the iPhone 17, and iOS 19 will be unveiled at WWDC next year. So I find the whole way that the iPhone 16 lineup has been marketed around Apple Intelligence, I find the way that, as I say, the new version of Siri as of Monday is going to look different.
It's going to feel flashy and new, but functionally, it is not different. I find all of this quite disingenuous, and I think that a lot of it is marketing spiel. But the reality is, what you have on Monday is writing tools.
No, no. Do not undermine what I think is truly a life-changing feature, and that is notification summaries. Yes, that is the most life-changing feature, in my opinion. I don't know that I could go back to being without it.
They are really good, but sometimes
it's annoying. But sometimes it's annoying what it does. Have you noticed this, by the way? And then you can go back to your rants on hating my favorite feature. One of the things I've noticed that does bother me is, if you don't check, like, a group chat for a while, or I feel like it even does this if you have read something, but I think it's like, if you had a notification from something, it adds in your own response to the summary notification.
So, like, you look at it, and it's like, this happened, this happened, and this happened, and one of those things that happened is something that you wrote, and you think that this other person sent it to you. But hold on, I know that's what I sent in a message beforehand. Why are you summarizing it? I've had times where it's been confusing and it's been summarizing past...
Oh, maybe it's more like it's summarizing past notifications with new ones and adding them into, like, the same sentence, as if those two things are related, and they're not necessarily related. So that's where I am having some issues. I don't know if that makes sense, but somebody out there listening to this who has used it is like, yes, I know exactly what you mean. And so, for that person, we're on the same wavelength. But otherwise, I don't know that I could go back to regular notifications. A summary is so helpful, to just see it. It is useful.
And I have enjoyed it. I find it quite funny when, because obviously it doesn't actually really understand humor, so if people send, like, jokey messages, it summarizes them completely seriously, and the way that it does that is itself quite funny. Like, I found myself screenshotting the notification summaries and just sending those to people to see how their own messages were summarized, which is quite funny.
But, like, if somebody says "I'm dead," it will literally say, like, "Dan is dead," and you're like, wait, what? Like, that's the kind of stuff that I'm talking about sometimes, where it'll add that to, like, another part of the notification that it's summarizing, and you're like, how does that make any... what is happening in this conversation? But don't get me wrong, I'm for it.
It is useful. I guess what I'm just saying is, if I was an average user and I had bought an iPhone 16 because of Apple Intelligence, and then I'd read some headline saying, oh, iOS 18.1, Apple Intelligence now available, I'd think, oh, what have I got for this thousand-dollar device that I've just upgraded to? I've got notification summaries and
writing tools. Is there anything else? I'm trying to think. That's probably a bad sign, probably a bad sign.
If I have to think about it... There are other features, but they are relatively small. You also get things like the Clean Up tool in Photos, which I've been enjoying using.
Yeah, that actually has... I feel like it's gotten better, or maybe I'm just using it on things where it makes more sense, but I've used it in some scenarios, like removing reflections from a window, and it's actually worked out really well.
Yeah. So that's why I don't want to be misunderstood about this. I like almost all of these features. I even like what is happening with Siri.
My issue is the way that this has been stylized and packaged up, and the way that it's been rolled out. And I almost would have preferred, in a way, for Apple to announce Apple Intelligence at WWDC, as it did, but then just simply say Apple Intelligence is coming at a later date as a whole, and be more truthful about that.
Then it would have been less confusing for users, and these features could have been more polished and finished. But it just feels so messy to me.
It's even like the system requirements for Apple Intelligence, needing eight gigabytes of memory, and you need, like, an M-series chip, or you need an A17 Pro or higher. But you get notification summaries on the Apple Watch, which doesn't have any of those specifications.
So notification summaries, are they an Apple Intelligence feature? Or are they not, since they don't meet the minimum requirements and are available on devices that don't? Apple has other machine-learning-type stuff, like, say, sleep apnea detection. That is effectively machine learning and AI. What is going on there? We're not calling that Apple Intelligence, but we're calling this other stuff Apple Intelligence.
And even, like, predictive text, that is effectively a similar algorithm. But we don't call that Apple Intelligence, because if we did, so much stuff in the system would be called that. So why can't we just let it be in the system? Why do we have to have everything be arbitrarily glowing to make us feel like something is shiny and new? It just feels rushed to me. It doesn't feel like an Apple thing.
This feels like a Google-type thing to me, and it just strikes me as not the way that Apple should behave. It's not bad. Nothing is bad. It's just... it's just not Apple-like behavior.
I agree. Having said that, I will say
something positive, which is that I think I quite like Visual Intelligence.
I was just about to ask you that. Quick, what do you think about Visual Intelligence? I don't know that I like it, because what I tried didn't work out very well. But, like, the things that they showcased at the event, when they announced that feature, those seem useful, and it is easy enough to get to it.
So I think it'll be fine. But, like, I just went outside and took a picture of a very famous building that's right outside of my window, and it just gave me Google image search results of that building. It did recognize it.
But it didn't tell me what it was. It was just, like, a reverse image search, basically. And then I asked ChatGPT... at the time, I didn't have that turned on, so I have to go try again.
But, like, I wanted to see, like, a card style, you know, where it's like, the building is a location that people go to as a tourist attraction, so, like, pull it up in Maps and give me that kind of info. It didn't do any of that, so I'm a little confused. I need to test it more. I was just trying to get the video out, so I wanted to just get it tested at least once, but we'll have to do some more
on it. But yeah, it definitely needs to be able to do more. But occasionally, where I've been trying it out today, it has worked quite well. So, for example, I took a picture of a letter I received, on a physical piece of paper, and it provided me with options such as, do I want to call the number on the letter?
Do I want to email the email address? It also offered a summary of the entire letter, and it also offered to read the letter to me. That is all quite useful. Where I think it could be more useful is actually taking action based on that. So if I want it to read the summary aloud to me and then let me make the audio louder, I can't. So if your audio is already set low, you can't do it. I mean, I'm not sure, that's beta stuff. But I also won't be able to say, okay, so pull up an email reply that is going to respond this way and that way, according to what I actually want to respond.
I can't even say, email this email
address. Um, yeah, okay. And likewise,
I found it useful, say, if I'd like to
scan the back of a cereal box for the nutritional information, and I said to it, how healthy is this cereal? It is able to actually provide a succinct answer to that. That stuff is quite cool. I quite like that. Yeah, what I'd like, for example, with that cereal, is for me to say, okay, I've had a serving of this cereal, import that nutritional information into Apple Health for me. That's the kind of thing I just can't do yet. Well, eventually it probably
will be able to do that, but you'd think so?
It should, when it can take action across different apps. But I guess this comes back to what I was saying before, where the danger, to me, of introducing these features in this way is that you get people using them, being a little bit disappointed or underwhelmed by them, and then they just won't go back and use them.
They might never use them
again, because you weren't blown away by it. And I think, unfortunately, Apple has a tendency as a company to abandon software, which then just gets neglected. I mean, think of how many Apple Watch faces have not been updated over the years, how very few watch faces support the Apple Watch Series 10's refresh rate, which is crazy if you think about it. Likewise, say, on the Apple Watch, the Walkie-Talkie app barely functions anymore. It has basically never had an update since it debuted.
And so there's plenty of... I think we can all point to apps across all of Apple's devices that we can say have just been left behind. And so I really hope that some of this stuff does not just get left where it is. Visual Intelligence needs to keep growing. It needs to get a lot better, quite fast, otherwise I'm just probably not going to use it.
I will probably forget about it, and I think a lot of people work in the same way. I think especially ordinary users, like, if I show maybe some of my friends or family who are not as into tech as I am a new feature or something they didn't know about, unless that feature blows them away in that second, unless it's really easy to use and works really well, they won't go back to it. Ordinary people don't have time for unfinished stuff. True.
I feel like sometimes I fall into that too, where it's like, this doesn't work the way I wanted it to, or it wasn't that cool, and you try it once and then you just don't try it again. And I think I'm an ordinary person, but I guess when it comes to tech stuff, we're probably not considered that.
So if I'm having a hard time, I can't imagine, you know, my brother, who likes tech stuff but doesn't follow it or care about it as much, so even he will find some features that I feel like he should know about and he just doesn't, and it's like, how did I not know you can do all of these things? And it's like, well, you know, you just kind of have to look for it, which is true for some things. But you shouldn't have to, and Apple should do a better job of marketing some of these really cool features, and not just the ones that have the glowing effect around them that are not finished yet. We need to work on that one still a little bit more. We'll have more on the rest of the update as, you know, we get more features introduced.
But next week, if you are in the areas where 18.1 will be available, try it out, and then let us know in the comments down below what you think of all of the Apple Intelligence features and everything, or hit us up on social media. And with that, unless Hartley has anything else left to say, we will catch everyone in the next episode.