Apple to Introduce Eye-Tracking Control in iOS/iPadOS 18: A Leap Forward in Accessibility
Just ahead of Global Accessibility Awareness Day this week, Apple made its annual accessibility announcement, introducing a range of new features aimed at assisting people with disabilities.
A significant highlight is the introduction of eye tracking (eye control) in iOS 18 and iPadOS 18. The feature lets people with physical disabilities operate their device using only their eyes: they can move through apps, games, and menus, and make selections simply by lingering on an element.
The underlying idea is not entirely new: the Mac's accessibility features already include "Dwell Control," which performs an action when the pointer rests in place. On iPhone and iPad, the setup and calibration process, powered by artificial intelligence, takes only a few seconds and is designed to learn the user's gaze direction.
Unlike the Mac implementation, where Dwell Control acts as an intermediary software layer on top of dedicated third-party eye-tracking hardware, the feature in iOS 18 and iPadOS 18 needs no additional hardware. It is supported on any iPhone or iPad with an Apple A12 chip or later, using the front-facing camera for gaze recognition.
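To make the dwell mechanic concrete, here is a minimal sketch of selection-by-lingering, written in Swift. Apple does not expose a public eye-tracking API, so the gaze source and the DwellSelector type below are hypothetical assumptions; only the general timing logic is being illustrated.

```swift
import Foundation
import CoreGraphics

// Hypothetical dwell-selection helper: fires once the gaze has stayed
// inside an element's bounds for a configurable interval. The gaze
// samples would come from some (unspecified) camera-based estimator.
final class DwellSelector {
    private let dwellThreshold: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellThreshold: TimeInterval = 1.0) {
        self.dwellThreshold = dwellThreshold
    }

    // Feed one gaze sample per frame; returns true when a selection fires.
    func process(gazePoint: CGPoint, over target: CGRect) -> Bool {
        guard target.contains(gazePoint) else {
            // Gaze left the element: reset the timer.
            dwellStart = nil
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            // Gaze landed on a new element: start timing.
            currentTarget = target
            dwellStart = Date()
            return false
        }
        // Gaze is still on the same element: fire once the threshold passes.
        if let start = dwellStart, Date().timeIntervalSince(start) >= dwellThreshold {
            dwellStart = nil  // prevent repeated triggers until gaze leaves
            return true
        }
        return false
    }
}
```

In practice the system would call process(gazePoint:over:) on every camera frame and show a progress indicator while the dwell timer runs, which is roughly the behavior Dwell Control exposes on the Mac.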
Beyond eye control, Apple is also making voice control on the iPhone and iPad more capable. With Vocal Shortcuts, users can record a custom utterance that Siri recognizes and tie it to a specific action. Another notable addition, Listen for Atypical Speech, employs on-device machine learning to learn a user's unique speech patterns and execute commands based on them. Both are aimed at making devices easier to operate for users with speech impairments or non-standard speech.
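As a rough illustration of the phrase-to-action idea, the sketch below uses Apple's public Speech framework to watch a live transcript for a custom phrase and run a callback when it appears. This is only an assumption-laden approximation: Vocal Shortcuts and Listen for Atypical Speech run through Siri's own on-device models, and the PhraseTrigger type and its simple substring match are invented for illustration.

```swift
import Speech
import AVFoundation

// Illustrative phrase trigger built on SFSpeechRecognizer. Requires the
// user to grant microphone and speech-recognition permission first
// (via SFSpeechRecognizer.requestAuthorization).
final class PhraseTrigger {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let phrase: String
    private let action: () -> Void

    init(phrase: String, action: @escaping () -> Void) {
        self.phrase = phrase.lowercased()
        self.action = action
    }

    func start() throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        let input = audioEngine.inputNode

        // Stream microphone buffers into the recognition request.
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Fire the action whenever the running transcript contains the phrase.
        _ = recognizer?.recognitionTask(with: request) { [weak self] result, _ in
            guard let self, let result else { return }
            if result.bestTranscription.formattedString.lowercased().contains(self.phrase) {
                self.action()
            }
        }
    }
}
```

A real implementation would debounce repeated matches and, like Apple's feature, adapt the recognizer to the individual speaker rather than relying on generic transcription.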