Apple Intelligence, the name for the AI features coming to Apple’s operating systems this fall, will launch with a relatively limited set of features. However, given how aware Apple Intelligence is of what you do on your devices, it could eventually do much more.
First, let’s briefly review the Apple Intelligence features that are coming soon. Then let’s get creative and explore some ways that Apple could make Apple Intelligence even more useful and powerful.
Apple Intelligence is a suite of AI tools that Apple will be releasing this year in beta, as part of an update to iOS 18, iPadOS 18, and macOS Sequoia. It will contain some features that have become common among AI tools, along with a set of features that leverage knowledge about you, your contacts, your activities, and what you do on your devices.
There are four big components of Apple Intelligence that Apple says should arrive within the next few months.
Now that we’ve covered the features we already know are coming soon, let’s use our imaginations a bit.
What hypothetical features could Apple add to its AI tools suite in the future? Here are nine ideas we hope Apple implements in the next year or two.
Apple Intelligence could help you find files more efficiently with natural-language requests. For example, imagine asking: “Show me that PDF I got by email this morning,” or, “What was the web page I was looking at with used car prices?” This would require Spotlight to index more than just files (the content of web pages, for example), but it would save a lot of time when you’re searching for files, emails, web pages, and more.
This feature might make the most sense on a Mac, but it could potentially work with an iPhone or iPad as well.
Apple does need to be cautious about how it approaches this one, though; Microsoft took a lot of flak for its controversial “Recall” feature, which takes constant screenshots so it can remember, for example, what you might have seen on a web page. (We discussed Microsoft’s temporary “recall” of Recall on episode 349 of the Intego Mac Podcast.)
Apple Intelligence could create playlists based on natural language prompts, which could include mood, setting, artists, music, style, etc. For example, “Make me a playlist for a picnic, with upbeat music by Bruce Springsteen, Taylor Swift, and other artists, with no explicit lyrics.”
Apple Intelligence could also recommend music more efficiently than the current discovery process on Apple Music. For example, you could ask Siri or the Music app to “recommend new artists that I’ve never listened to that are similar to what I’ve been listening to this week.”
The TV app currently doesn’t make personalized recommendations; it just highlights new Apple TV+ content and best-selling movies and TV shows in its store. Apple Intelligence could recommend movies or TV shows after you’ve finished watching something, based on your viewing history, and these recommendations could be more pertinent than the algorithmic recommendations of streaming services such as Netflix or Disney+. This would be especially useful if you could rate what you’ve watched.
It should be relatively trivial for Apple Intelligence to create a Keynote presentation from a bullet list. You could write an outline, choose a template, and Keynote would generate a presentation in seconds, with slides, transitions, and effects based on your outline. It could also add images, such as logos or product photos, pull in data from spreadsheets, and handle aligning items and balancing slides for you.
By learning users’ routines and habits, and by accessing real-time weather data, Home could automatically suggest activating and deactivating scenes and devices. Knowing whether anyone is home, it could suggest raising or lowering a thermostat, watering a lawn or flowers, or turning certain lighting scenes on or off.
Apple Intelligence could create personalized travel itineraries based on user preferences, past travel history, and current trends, including recommendations for restaurants, attractions, and activities.
Many websites and apps (Tripadvisor, for example) try to do this, but they aren’t very effective. Apple could hypothetically offer a different approach, taking into account all of your travel history, recorded via the Maps app, and your preferences over time. As your vacation progresses, you could rate the suggestions to improve future recommendations.
Apple Intelligence could crunch data from the Apple Watch and the Health app to provide personalized health and wellness insights. This could include predicting health trends, offering customized fitness plans, and providing mental health support. The current “nudge” reminders on the Apple Watch are not very useful; personalized suggestions based on a person’s actual habits would be far more actionable.
Apple Intelligence could automatically organize reminders based on user habits and context, such as suggesting that users complete tasks when they arrive at a specific location or at certain times of the day, or taking into account existing events and previous interactions with others. For example, if you create a reminder to “Call Sonia tomorrow,” your device could remind you to make the call at a time when you usually speak to that person, or when you’re between meetings.
And one more thing: Apple Intelligence could potentially bring old photos and videos to life by making them three-dimensional Spatial Videos. This makes them absolutely stunning to view on Apple Vision Pro (or Meta Quest, or other VR/AR headsets or glasses). It’s something you have to see to believe, but it’s already possible with other tools; we’d love for Apple to make this feature a native part of Apple Intelligence.
This is insane!
This is a 720p (2D) video I shot of my kids 15 years ago! I used AI to upscale it, depth-map it, & used the depth map to convert it to 3D SBS, & then converted that to Spatial Video for Apple Vision Pro (MV-HEVC). My son is currently a senior in high school,… pic.twitter.com/8kBvo4CWzp
— Blaine Brown (@blizaine) February 8, 2024
The Apple Intelligence that we’ll see in late 2024 is just the beginning of a new era of personal assistance on Apple devices. It’s obvious that, as this sort of AI improves, Apple will be using it for other features—many of which are contextual, based on your device, your activities, and your files.
You can also subscribe to our e-mail newsletter and keep an eye here on The Mac Security Blog for the latest Apple security and privacy news.