Apple introduces a revolutionary function: the iPhone imitates our voice
(Photo: Evan Palvan)

In the spirit of inclusion, Apple is once again demonstrating its innovative strength: the tech giant recently introduced a slew of new features aimed at cognitive, vision, hearing, and mobility accessibility. The Personal Voice feature, which gives people at risk of losing their ability to speak a way to preserve their voice, deserves special attention.

The Personal Voice feature allows users to create a synthesized voice that sounds like their own and use it to communicate with friends and family. To create a personal voice, users need only spend about 15 minutes reading a set of predefined text prompts aloud on an iPhone or iPad.

Thanks to integration with Live Speech, they can then type what they want to say and Personal Voice will "read" it aloud to the person they're talking to. Apple asserts that the feature uses on-device machine learning to keep users' information private and secure.

(Photo: iStock/hirun)

As if that weren't enough, Apple also introduced Assistive Access, a set of streamlined versions of its core apps designed specifically to help people with cognitive impairments. The redesign aims to make the apps' essential features easier to reach and to reduce cognitive load. Examples include a combined experience for the Phone and FaceTime apps, as well as customized versions of the Messages, Camera, Photos, and Music apps that feature high-contrast buttons and large text labels.

At the end of last year, observers found references to a "custom accessibility mode" in beta versions of iOS 16.2. Now, Apple has confirmed that the new features will be available to users "later this year," hinting at their arrival with iOS 17.

Another step toward inclusion is Magnifier's new detection mode, designed specifically to help users who are blind or have low vision interact with physical objects bearing text labels. For example, a user can point their device's camera at the controls of a microwave oven, and the iPhone or iPad will read the labels aloud as the user runs a finger over the numbers or settings on the appliance.

Apple also announced a number of improvements for Mac users: people who are deaf or hard of hearing can now pair their Made for iPhone hearing devices with a Mac. In addition, text size can now be adjusted more easily in the Finder, Messages, Mail, Calendar, and Notes apps on Mac.

Additional features, such as the ability to pause GIFs in Safari and Messages, adjust Siri's speaking rate, and use Voice Control phonetic suggestions when editing text, complement Apple's existing accessibility features for Mac and iPhone, which include Live Captions, the VoiceOver screen reader, Door Detection, and People Detection.

At a time when inclusion and accessibility are becoming ever more important, Apple's new features once again prove that the company takes these issues seriously and is continually improving its products in this area.