Here’s what your iPhone 16 will do with Apple Intelligence — eventually

Apple Intelligence will miss the launch of the new iPhones, but here’s what’s coming in the iOS 18.1 update and beyond.

The Apple logo with a little AI sparkle.

Image: Cath Virginia / The Verge

Apple heavily sprinkled mentions of AI throughout its iPhone 16 event on Monday. However, generative Apple Intelligence features won’t be ready for the public launch of iOS 18 on September 16th or the new iPhones when they’re released on September 20th.

The first set of Apple’s AI features is scheduled for public availability next month in most regions — except the EU — as part of a beta test for iPhone 15 Pro and all iPhone 16s, plus Macs and iPads with M1 or higher Apple Silicon chips. At launch, they’ll be available in US English only.

What’s coming to Apple Intelligence in October

Writing Tools

  • Text Rewrite: Text Rewrite will morph your email draft into a more professional one, and you can also change the tone to be friendly or concise.
  • Proofread: As in real life, this proofreading feature should correct your grammar and sentence structure and suggest better words throughout your work.
  • Summarize Text: Think of it as letting AI write the tl;dr for you. Summarize Text will shorten your writing to just the important parts or turn it into a bulleted list or table.
Screengrab of an AI generated table in iOS 18

Apple Intelligence made this table from a scruffy list.
Screenshot: Allison Johnson / The Verge
  • Smart Reply: We’ve seen this AI feature shown off quite a bit. Smart Reply will give you a few contextual suggestions to get you started on a reply in Mail or elsewhere.

Siri

Animation showing Siri taking an address from a message and adding it to a contact card.

The new Siri glows up.
GIF: Apple
  • New look: On iPhone, iPad, or CarPlay, Siri will appear as a rainbow ring around the edges of the screen, and on Mac, Siri can float and be placed anywhere on your desktop.
  • Apple’s new language model: Siri should also get a bit smarter and better at parsing natural language thanks to Apple’s on-device language model. Meanwhile, more complex requests will be sent to Apple’s “Private Cloud Compute” servers, which Apple claims act as a computational extension of your device and do not retain any of your data.
  • Type to Siri: Instead of talking, you’ll be able to type questions to the assistant anytime.
Screengrab showing Siri’s text input option in iOS 18.

Sometimes you don’t want to talk to your phone to get stuff done faster.
Screenshot: Allison Johnson / The Verge

Photos

  • Clean Up: Similar to Google’s Magic Eraser, Clean Up will remove unwanted objects from your photos.
  • Search: You’ll be able to search for photos using natural language to find specific subjects you can’t track down just by scrolling through your library.
  • Memories: You’ll be able to make a movie using media from your Photos library by writing out a prompt, and it should create a narratively driven story with chapters.

Transcription

  • Phone call recording and transcription: You’ll be able to record phone calls and get a transcription of the whole call. Activating this feature will tell all parties that the call is being recorded.
  • Voice recordings in Notes: You’ll be able to record audio directly in the Notes app, and it will transcribe the speech into text. You can also use Apple’s other writing tools to summarize the whole session.

These Apple Intelligence features are arriving later

Apple says other AI features will “roll out later this year and in the months following.” That means these features could arrive as soon as October or not until next summer or fall. Unfortunately, these are also some of the most eye-catching features coming to Apple Intelligence.

A combination of using Apple’s Visual Intelligence and OpenAI’s ChatGPT to analyze a photo of a Mediterranean coastline and how to set up a photoshoot there.

Image: Apple
  • Visual Intelligence: Apple’s new Visual Intelligence, introduced during the iPhone 16 presentation, lets you search for things just by snapping a photo. You could, for instance, take a picture of a cafe storefront and get information about it, like its hours and menu, or take a photo of a concert poster and add the event to your calendar. When it arrives, Visual Intelligence will be activated with the Camera Control button on the iPhone 16 and 16 Pro.
  • Genmoji: You’ll be able to create your own emoji by entering a text prompt, and Apple’s image generator will make you a new emoji you can send to friends.
Make your own emojis.

Image: Apple
  • Image Playground: In addition to making custom emoji, Apple Intelligence will also eventually create custom images. Enter a text prompt for whatever image you’d like (assume some restrictions will apply), and Apple’s models will conjure up a picture for you.
  • Siri Personal Context: Siri will eventually become more useful by drawing on your personal context and helping with onscreen information on your iPhone, iPad, or Mac.
  • OpenAI connection: Anywhere Apple Intelligence writing tools appear, you’ll also have the option to use ChatGPT for additional generative AI capabilities. ChatGPT should also be able to handle Siri requests that call for more advanced answers.
  • Third-party app connections with Siri: Apple’s also promising Siri will one day complete in-app requests, like making photo edits in an image editing app using pictures in your Photos app.
