Recapping WWDC 2024 from Apple Park


There was no new Apple hardware at WWDC 2024, but Apple still had tons of news around AI and its upcoming operating systems. In this bonus episode, Cherlynn and Devindra brave the California heat to discuss Apple Intelligence and how it's different from other AI solutions. They also dive into other new features they're looking forward to, like iPhone mirroring in macOS Sequoia and iPadOS 18's surprisingly cool Calculator app.


Listen below or subscribe on your podcast app of choice. If you’ve got suggestions or topics you’d like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!

Hosts: Devindra Hardawar and Cherlynn Low
Music: Dale North and Terrence O’Brien

Devindra: What’s up, folks? This is Devindra here, and we are live at Apple Park. Cherlynn and I are in the middle of covering Apple’s WWDC conference. Cherlynn, what’s up? How’s it going?

Cherlynn: I feel quite zen right now, because even though I have a lot more meetings coming up, we are seated outside, it's nice out, and even though it's really hot, I'm not dying. It's nice. I'm chill.

Devindra: It’s nice we are both, we’ve gone through four to five meetings. For both of us. We’ve gone through the keynote. We’re writing a bunch of news folks. So we’re just gonna sit down and Give you our thoughts about what’s going on. Cherlynn and I also did a video that’s up on our YouTube channel recapping why we think Apple intelligence is doing things a little differently and maybe better than Stuff from Microsoft and Google, but yeah, Sherlyn, you’ve been talking with Apple a lot.

What is your general takeaway from this year’s WWDC?

Cherlynn: Yeah, to set the stage a little: I had my first meeting at 8 a.m. this morning, and it's been four meetings since then, like you said, Devindra, covering topics like Apple Intelligence, privacy, iOS 18, iPadOS 18 and watchOS 11 as well. My main takeaway is that yes, throughout the keynote we heard things we've seen on other platforms, right?

Like they’re blatantly copying magic eraser from Google’s editor on, this thing called cleanup and photos. and they’re adding different things like, oh, you can now rearrange your apps and skin them the way you can in Android’s material you. But, the way Apple’s thought things through proves and continues to prove to be different from everyone else.

It’s a bit more thoughtful, a bit cleaner, a bit more sophisticated. And, again, I think you see this most in Apple Intelligence. And, Devindra, you’ve been asking everybody here, can we say AI? Can we?

Devindra: I don’t know. So one thing I started figuring it out, or at least as we were writing about Apple Intelligence, is that Making an acronym for it is tough because I can’t just call it AI and then talk about That stuff versus Copilot or versus OpenAI and I’ve started using Apple AI as a way to shorten it But I have been asking Apple folks here basically everyone we’ve encountered about how they shorten Apple intelligence and the resounding response I get is like a data processing error.

It’s like watching a human kind of just like stop being able to process information. They look over to the PR person. They’re like, what do I see here? But the response I always get is, Apple intelligence. That’s all we say. We only ever say Apple intelligence. One person said, personal intelligence, which is a phrase Tim Cook used.

But yeah, it's funny; it seems almost like a corporate command not to call Apple Intelligence AI or shorten it that way at all.

Cherlynn: They think of the letters AI as standing for Apple Intelligence, it seems. And the word they fall back on when they don't want to say Apple Intelligence is just "intelligence." Still, AI is so much easier to say, in my opinion.

Devindra: It just feels like they've stumbled into this weird branding hole, where they took the letters A and I but they can't use "AI," even though it is an AI-powered thing. I just think it's funny, and it shows how absurd these companies can be at some points. But yeah, let's briefly talk about Apple Intelligence, Cherlynn. I'm more impressed by what Apple's doing here because it does seem like they're announcing features we'd actually want to use, and it's more centered on features within apps, stuff like making Siri better, rather than what Microsoft did.

Microsoft was just like, hey, nobody likes our search engine, so we put AI in our search engine and everybody all of a sudden thought it was cool. Then they rebranded it as Copilot, put that in Windows, and it's like, dot dot dot, profit. I don't think it actually led to anything.

I don’t care about copilot in Windows It hasn’t been functionally useful for me But just looking at the stuff here that Apple has shown off like I want to use this new Siri I want to use a lot of these new features that they’re showing off. I don’t know if you feel differently

Cherlynn: I think Siri is only one part of the Apple Intelligence puzzle.

I think there’s a lot of other stuff that they were, that they demoed that would be very intriguing. I do feel like a lot of their writing tools, things that we’re going to see on Mac and iPad, are things we’ve seen elsewhere, like Copilot, like Gemini and OpenAI, have all offered some version of rewriting something for you, summarizing it for you, providing a TLDR.

Apple, being the vertical-integration king that it is, is good at bringing it in so that when you highlight a body of text, you see this blue or yellow or whatever circle up at the top, where you find your copy and paste options, and you can also go there to get a writing tool: yeah, help me adjust the tone of this cover letter I'm writing, for example. It's stuff we've seen, just applied a bit better, a bit more thoughtfully.

On the Siri side, they've redesigned Siri to better understand you if you interrupt yourself in the middle of issuing a command. If you're like, oh, adjust this timer, oh, sorry, set it for 15 minutes, not 20, that sort of thing, it will do it. It's smart enough. It is definitely more thought out, more system-wide and deeply integrated, and, to use their own words, more personally and contextually aware than, say, Gemini on a Pixel phone. And that's the only real comparison I can make, because Copilot on Surface PCs doesn't seem that deeply integrated just yet.

Devindra: It almost seems like Copilot is directionless. It's like Microsoft was just like, hey, OpenAI is cool, you like ChatGPT? Look, we put it in Windows. Are you not entertained? Aren't you happy about this? And I wasn't; I've tested this stuff for a while. I think Google's at least trying to be a little more thoughtful about how it's doing it with Gemini.

Like it’s trying to like hook into all the Google services and all the stuff you’re already relying on. But Apple’s whole thing is like they are building on the privacy standards that they have talked about before a lot of this Processing is happening on device with their local models They do go to the cloud for some of their like more complex things.

But we also read about the, what is it, Private Cloud Compute thing they were talking about, and even that seems cool; I'd recommend you all read up on it. It's weird to even discuss something like this, but they have basically created a cloud solution that they say is more secure, an anonymized connection.

First of all, Apple's models only send little bits of data to their cloud, and it's anonymized, like the Private Relay VPN-ish thing they have on iPhones. These servers don't save your data, they don't save logs, so that's also something that will prevent authorities like the police or the FBI from getting records of what you're doing; Apple is just keeping itself out of that. They also say they're publishing the images of the software running on the servers for researchers to audit and take a look at, and your phone can only talk to servers running the exact software it expects, so your phone will have to keep getting updated. There are just multiple layers of security, which is not the sort of thing I think most people think about when they're using cloud services, at least from what I've seen.
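To make that flow a little more concrete, here is a minimal, purely illustrative Swift sketch of the routing Devindra describes: try the on-device model first, and only hand a request to a cloud node whose software measurement matches one of the images Apple says it publishes for auditing. Every name here (OnDeviceModel, PrivateCloudNode and so on) is a hypothetical stand-in, not Apple's actual Private Cloud Compute API.

```swift
import Foundation

// Hypothetical stand-ins for illustration only; none of these names are Apple's real API.
struct IntelligenceRequest { let prompt: String }
struct IntelligenceResponse { let text: String }

enum CloudError: Error { case unverifiedServer }

struct OnDeviceModel {
    /// Stand-in heuristic: small requests stay local, bigger ones go to the cloud.
    func canHandle(_ request: IntelligenceRequest) -> Bool {
        request.prompt.count < 500
    }

    func respond(to request: IntelligenceRequest) -> IntelligenceResponse {
        IntelligenceResponse(text: "handled on device")
    }
}

struct PrivateCloudNode {
    /// Measurement of the software image the server says it is running.
    let reportedImageHash: String
    /// Hashes of the images Apple has published for security researchers to audit.
    let publishedImageHashes: Set<String>

    func respond(to request: IntelligenceRequest) throws -> IntelligenceResponse {
        // The device refuses to talk to any server whose image isn't on the audited list.
        guard publishedImageHashes.contains(reportedImageHash) else {
            throw CloudError.unverifiedServer
        }
        // Only the minimal slice of data needed for this request is sent, over an
        // anonymizing relay, and the server keeps no logs of it.
        return IntelligenceResponse(text: "handled by a verified cloud node")
    }
}

func handle(_ request: IntelligenceRequest,
            local: OnDeviceModel,
            cloud: PrivateCloudNode) throws -> IntelligenceResponse {
    // Prefer the on-device model; only fall back to the attested cloud node.
    if local.canHandle(request) {
        return local.respond(to: request)
    }
    return try cloud.respond(to: request)
}
```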

Cherlynn: One thing: the irony of Copilot being directionless is just quite funny to me. You don't want a copilot, or a pilot, to be directionless. But anyway, yeah, Private Cloud Compute is definitely something Apple is approaching differently compared to Microsoft and Google, in that they explicitly lay out how anonymized, protected and encrypted your data is. And to Apple's point, which is something Craig Federighi made during the keynote as well, they actually put it out there: they want independent verification and validation that all of this, including the transfer of your data, is happening securely.

For example, right after the keynote, a certain CEO or owner of a certain social media platform said the OpenAI integration with this thing is going to be a security risk; I'm referring to Elon Musk's posts on X. And from my understanding, having taken a lot of meetings since, the OpenAI integration works like this: whenever you ask an Apple Intelligence device a query, Siri for example, the first thing it does is figure out whether it can handle it on device or whether it needs to go through Private Cloud Compute to the servers.

On device is obviously quite direct, right? But if it needs to pass your information on to ChatGPT because you've asked it something that calls for that, it will first ask: hey, do you want to pass your information to ChatGPT? And it will do that every time; you're not going to say yes once and then never be asked about ChatGPT access again. Then there's a contractual thing between OpenAI and Apple that prevents OpenAI from storing your requests, and Apple is also not handing over any IP address information. It uses that sort of Private Relay approach to pass along whatever is needed while hiding the actual info. And once OpenAI has done its ChatGPT thing with your answer, it's supposed to erase your information and get rid of it. It's a contractual thing, supposedly, and that remains to be seen; it comes down to how much you trust OpenAI to do that. The same sort of concepts apply to Private Cloud Compute here.

So again, very well thought out, right? Just very Apple in its approach.
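Reading between the lines of that description, the per-request consent gate might look roughly like this in illustrative Swift. Again, every name here is made up; the point is simply that the permission prompt sits in front of every handoff, and the IP-hiding and no-storage guarantees live on the other side of it.

```swift
import Foundation

// Hypothetical types for illustration; this is not Apple's or OpenAI's actual API.
struct SiriQuery { let text: String }

enum HandoffError: Error { case userDeclined }

/// The user is asked on every single request; there is no "always allow"
/// that skips the prompt the next time around.
func userApprovesChatGPTHandoff(for query: SiriQuery) -> Bool {
    // In a real system this would present a confirmation sheet each time.
    return true
}

func askChatGPT(_ query: SiriQuery) throws -> String {
    guard userApprovesChatGPTHandoff(for: query) else {
        throw HandoffError.userDeclined
    }
    // The request would go out through a relay that hides the device's IP address,
    // and, per the arrangement Cherlynn describes, OpenAI is contractually
    // barred from storing it once the response comes back.
    return "response generated by ChatGPT"
}
```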

Devindra: Thoughtful, I think, is the word. I don't want to sound, like I said in the video, too much like a fanboy; we have not seen this stuff in action or in the wild yet. But I think the initial problems we saw were with something like Microsoft Recall, which was a cool idea.

But Microsoft took a blunt-force approach: hey, we're just going to remember everything you did on your computer by capturing everything you did on your computer, and we're going to save it in a database on your system that, apparently, anybody can access, with very little protection around it.

And it literally took days for security researchers to be like, what the hell is this? This is very easy to break through. Microsoft ended up having to basically rework how that feature works. Initially it was always enabled and you had to opt out of it; now it's opt in. People had to complain to get these very obvious issues addressed. At the very least, I don't have that sense with Apple. I feel like they've at least sat down, maybe also talked with researchers, and asked: is this cool? Is this actually copacetic in terms of privacy and user safety and everything?

So I don’t know if you have any further thoughts on that.

Cherlynn: Because Apple knows the price to pay if it's caught with egg on its face is so high, arguably higher for it than for any of its rivals, it's all the more invested in making sure this is done the right way, and honestly I wish Google and Microsoft would take notes. I will say there are a lot of other privacy things that are very intriguing to me. I'm fresh from a privacy-related demo, and the Passwords app is a new thing that I'm very excited about; I welcome it. They're also changing certain things, like allowing or limiting access to all your contacts in the permission settings for the various apps that ask for it.

For example, if I'm playing Match Factory, why does it need access to all my contacts? It doesn't. I also think it's funny, no, maybe not funny: one of the new features coming to iOS 18 is locking and hiding specific apps. Locking makes sense, I get it. Hiding, though, seems like you're Ashley Madison-ing things for everybody, letting the cheaters of the world keep secrets.

I don’t know how I feel about that, but, it’s the, atomic bomb thing, right? Do you make it and then let people use it how they will, or, I don’t know who at Apple decided this was a necessary feature. Are you going to use this feature Devendra?

Devindra: Listen, I could hide my Tinder account somehow.

I don’t know. but I do think the app blocking thing is cool because parents often have to give their devices to kids and Oh, you don’t want them to swipe away, you don’t want them to do other stuff. So this way you can lock an app if you want to show off your photos or show off something to somebody and just have them not poke around, which has been.

It’s also like a very common problem we see on like TV shows and movies where somebody would be like, Hey, can I take a look at this photo? And they’re like, get all your personal data from your phone instantly because it’s open. So it seems like a very smart way of dealing with privacy too.

Cherlynn: And very Apple-esque in that if you lock an app, say Messages, for example, the contents of that app also won't show up in search, or in Siri suggestions, or Spotlight suggestions, or even Maps suggestions. There's just a lot here. And just to take some broader notes away from WWDC, like I said earlier, there are a lot of small changes that make everything seem very meaty. iOS 18 actually might be a big upgrade: the Messages updates that are coming, the new Tapback emoji.

Finally, we can do more than exclamation marks. Sometimes I just want to make a sad face, and I can't do that; I have to do a thumbs down. I like that they're coming. Oh, and I'm back to Apple Intelligence, I know I'm jumping around a bit, but talking about emoji, another thing Apple did right from the get-go, I think because it's been able to observe the pitfalls other people have fallen into, is to say: okay, we're limiting this to very obviously cartoonish, graphic-like representations, so nothing photorealistic.

And then when it’s creating images of people in gemmoji, you can only use your own creation. So you’re basically choosing from a template or based on your like people that you have in your photos or your people gallery sort of situation. But because it’s in a cartoonish representation, people are never going to mistake it for someone that’s actually a real life.

You can’t, for example, there are guardrails in place that like, prevent you from making the image playground generate something that looks harmful or violent or is exploitative. which again, goes to show, Apple’s thought this through, right?

Devindra: And I think a lot of people are asking: what are these models trained on?

Because Apple’s talking about a lot of its own models, small ones that run directly on your device, larger ones that are in the cloud. And occasionally they’ll reach out to OpenAI for chat GPT stuff. Apple has told us that they are training their models on licensed data, like images, things like that.

Some stuff comes from the open web, and publishers can refuse to participate; they can say their site isn't crawlable by Apple's tools. And they say that if somebody changes down the line what they want to be accessible to Apple's models, Apple will reflect that with further updates.
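For anyone wondering what "refusing to participate" looks like in practice, Apple has documented an Applebot-Extended user agent that publishers can block in robots.txt to keep their content out of model training while still letting regular Applebot crawl them for search. A minimal example (the exact rules are up to each publisher) would look something like this:

```
# Let Applebot crawl the site for Siri and Spotlight search results,
# but opt the whole site out of Apple's model training.
User-agent: Applebot
Allow: /

User-agent: Applebot-Extended
Disallow: /
```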

So again, it’s opaque, but at least what they’re telling us. To me sounds better than what I’ve heard from Google and certainly from open AI. so I think that’s cool real quick. Let’s talk about macOS Sequoia, which has most of the features again All these features pretty much come across all of its products.

So Apple Intelligence, by the way, is going to work on Macs running Apple silicon, M1 through M4; no M4 Macs yet, but M-series Macs, and also the iPhone 15 Pro. Cherlynn, you wrote a piece about the features people can expect if they have an iPhone 14 Pro. Basically, you're out of luck.

You get some iOS 18 features, but not everything, right?

Cherlynn: Yeah, all the iOS 18 features, but none of the Apple Intelligence features, unfortunately. So that redesigned Siri with the glowing edges? That's not coming, and it is so pretty. I also want to say that the iPadOS things seem really cool: all the Pencil features, the handwriting stuff.

In the keynote, or in demos I've taken, some features are described as ML-powered; Smart Script, for example, is powered by machine learning, but it's not part of Apple Intelligence. So you're still going to get that in iPadOS 18 when you upgrade, regardless of whether you have an M1 iPad or an older one. But yeah, I've got to talk to you about Math Notes, Devindra. Were you blown away by that demo? When they just draw the equals sign and the thing solves itself. It's Mean Girls Mathletes, but on a whole other level to me.

Devindra: It’s, it’s cool, and that’s also something they say is ML powered, not necessarily Apple Intelligence powered.

So if you have an older iPad, you'll see some benefits of that. It's cool, but I also feel like, for now, it's a superpowered calculator. I don't know how many people have Apple Pencils and are scribbling down math formulas, but it's cool. I dig it.

Cherlynn: To begin with, the fact that the iPad never had a calculator app before this is astonishing.

But then now that it’s here, Apple’s clearly thought about look, we’re bringing this to the bigger screen. We want it to be pencil friendly. We want it to be big screen friendly. Let’s really think about the layout here. And this is explains why there’s been a delay. And I actually get it. there’s you can go into the history tab to see your previous like calculations. There’s a lot more calculations you can do on this calculator, A currency conversion, which I forgot to ask, like, how is it pulling the actual rate? But whatever, and then you can go into the notes section and then I feel like almost feel like calculator is a misnomer in this case because it’s doing way more than calculating and solving equations.

It’s like you can draw like a blueprint of a house and have it measure the areas like length and width, whatever, but at the same time, map that to like price calculations, like price estimates, like if you use this material. So something I saw happen was like you did Price equals X, area equals Y, and then price times area equals, and once you draw the area, it’s like programming basically, but all done in the notes app.

And that’s really it blew my mind a little bit, which I hate to admit because I don’t like to be so like, fangirly, but damn, that was cool.

Devindra: It was cool. Maybe that's the excuse for not having a calculator app built into the iPad until now. One thing I want to mention about macOS Sequoia is iPhone mirroring, which was something I half predicted.

I wrote a wish-list piece about what I'd want to see in Vision Pro and visionOS 2, and one thing was that I'd really love to be able to mirror the iPhone the way you can mirror a MacBook inside Vision Pro, with a full projection of the screen. That's not coming to visionOS 2; visionOS 2 seems like a very minor update.

But it is coming to macOS Sequoia, and to use it you need an iPhone running iOS 18 and a Mac running Sequoia, and you get it almost instantly. I haven't seen how it actually works in real time, but it does seem like you hit a button, you get a windowed view of your iPhone, and you use it on your Mac as you would in real life.

You see your home screen, you can scroll between apps, and your notifications are very smartly reintegrated into the Mac's notifications. That's fun. You can play games from your iPhone, and when you launch a game, the window goes widescreen. The audio seems to come through pretty quickly. It just seems like a really cool feature, because at least on Macs, I always have my phone nearby.

It’s always like doing other stuff, but I would love to be able to like just have that open and also see other notifications coming in. It’s just like very extensible in terms of like how you’re interacting with your hardware. The iPhone, by the way, stay the screen stays locked. So it doesn’t look like somebody is just like you’re just like mirroring a direct computer or something.

One thing we learned from Apple, because I've been asking around about this: if you mirrored your Mac inside the Vision Pro, and that Mac was also mirroring an iPhone, would you actually be able to do the iPhone stuff from within the Vision Pro? I've heard from a couple of folks that that's basically not going to happen.

iPhone mirroring is part of Continuity, those features that let you copy and paste across devices and such, and you can only run one mirroring feature at a time. So basically you can't chain it through the Mac. I'm still sitting here waiting for iPhone mirroring in visionOS.

Clearly, though, they have the capability. The Vision Pro is running an M2 chip, and if iPhone mirroring works on M1 and M2 Macs, there's no reason it won't come to the Vision Pro eventually. So I feel like we half won that bet, basically.
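Just to spell out the constraint Devindra mentions, here's a tiny hypothetical Swift sketch of the "one mirroring session at a time" rule. It isn't a real Continuity API, just a model of why the Vision Pro to Mac to iPhone chain doesn't work.

```swift
// Hypothetical model of the Continuity restriction; not a real Apple API.
enum MirroringSession {
    case macIntoVisionPro   // Mac Virtual Display inside the Vision Pro
    case iPhoneOntoMac      // iPhone mirroring onto a Mac
}

struct ContinuityState {
    private(set) var active: MirroringSession?

    /// Starting a second mirroring session is refused rather than chained.
    mutating func start(_ session: MirroringSession) -> Bool {
        guard active == nil else { return false }
        active = session
        return true
    }

    mutating func stop() { active = nil }
}

var state = ContinuityState()
print(state.start(.macIntoVisionPro))   // true
print(state.start(.iPhoneOntoMac))      // false: no chaining iPhone -> Mac -> Vision Pro
```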

Cherlynn: I just gotta say that Windows and Android have been trying to do this forever.

I remember years ago when the first Galaxy Books tried to do this; that's at least my earliest encounter with it, and it works. I haven't seen it happen yet on the iPhone side of things, as in I haven't personally taken a demo, so I couldn't tell you if it's actually better or more thoughtful. But knowing Apple, knowing its deep-integration prowess, it's probably going to work better.

Devindra: Probably. Apple also gives very good demos, so that's something we've learned. Do you have any further thoughts about WWDC or what's ahead for Apple, Cherlynn?

Cherlynn: I have so much to dive into in detail, like the watchOS stuff, the iOS and iPadOS features.

I guess, broad strokes, it feels almost revolutionary, because Apple is finally jumping on board the AI train, and renaming the train, taking it over, basically. And you know what? The thing is, I hate this, but now people are going to pay attention. Now my friends are actually going to realize what Genmoji is supposed to do, what you can actually do by feeding an AI generator.

It’s gonna, and we’re gonna start seeing more, writing tool assisted emails and reviews, I think it’s momentous. I think people are really going to start paying attention to what AI means and what it can do. I don’t know if it’s good for the world, but yeah, it just feels like big.

Devindra: We ran out of time talking to Apple people, but I did want to ask them, do you think these writing tools are actually helpful?

Because then all our emails, all our conversations, are going to start to sound like weirdly robotic or extra-formal AI documents, and I'm not a fan of that, not too interested. But the Genmoji stuff is cool, because we've had DALL-E and other things create these AI images, and what do you do with them? Post them on social media? I don't know. Genmoji is just: if you want to create an emoji based on a specific feeling, you can create a thing to your liking. Just a really smart use of that technology.

So anyway, I'm in the process of installing the iOS 18 developer beta on my phone. I think according to the rules we can't talk about that, but we can talk about it when they launch the public beta, which is later next month, I believe. But we're going to be testing this stuff out and thinking about these features. Any other takeaways from Apple, Cherlynn?

Cherlynn: No, but send us your thoughts, right? podcast@engadget.com is the most direct way to reach us. And come back: we do a Thursday livestream on our YouTube channel with direct Q&A sessions where we can probably answer your questions in real time. And I'm pretty sure we'll continue to dig deep into what we learned this week on our episode that drops on Fridays, or Thursday nights, right? Come back for all of that.

Devindra: Yeah, definitely. We're still going to be doing a longer, normal podcast episode this week. Cherlynn and I are in California now, but we'll be flying back tomorrow and will be ready to podcast and livestream on Thursday. So we'll be back, folks. Let us know what you think about all this news at podcast@engadget.com. Thanks, folks. We're out.

This article contains affiliate links; if you click such a link and make a purchase, we may earn a commission.
