Every year, with each new iteration of its mobile operating system, Apple adds new accessibility features to the iPhone, and iOS 16 is no different. This time, Apple is adding Live Captions to FaceTime calls (taking cues from the third-party Navi app, which we have covered before).
This feature transcribes speech in real time during a FaceTime call, helping users who have difficulty hearing, although, like many of Apple’s accessibility features, it can be useful to anyone. Live Captions can also help people who have difficulty following spoken English, or people in loud environments.
Note: This is a new feature in the iOS 16 public beta, and it will roll out to all supported iPhones when the stable update ships in fall 2022.
How to enable Live Captions during FaceTime calls
Live Captions is an accessibility feature that needs to be enabled manually, and it works for both FaceTime calls and standard video apps.
To enable it, go to Settings > Accessibility > Live Captions (Beta). First, enable the “Live Captions” feature, then enable the “Live Captions in FaceTime” option.
From the “Appearance” section, you’ll be able to customise how the subtitles appear. Here, you can increase the font size, make the text bolder, and even change the colour, helping make the captions easier to read.
Now that the feature is enabled, Live Captions show up in a bubble at the top of the FaceTime call. Here’s what it looks like:
What’s cool is that this works during multi-person calls as well: Apple automatically attributes the captions to whoever is speaking. The Live Captions bubble floats above the call, and you’re free to drag it anywhere on the screen. And if you’d rather focus on the text alone, there’s a full-screen mode as well.
And because Apple generates these Live Captions on-device, you don’t need to worry about a third-party service storing or processing your voice data.