
Sunday, May 19, 2024

Control an iPhone or iPad with your eyes

 Apple reveals Eye Tracking control of an iPhone or iPad and other new features 

Tim Cook’s American tech giant aims to provide the “best possible” experience to its users. Apple users will soon be able to control their iPhones and iPads with their eyes, after the company unveiled an array of new accessibility features for its products. They’re putting the “eye” in iPhone. “Eye Tracking,” which is designed for users with physical disabilities, will be powered by artificial intelligence and won’t require any additional hardware. Apple is rolling out a set of new accessibility capabilities for iPhone and iPad that will become available later this year, and Eye Tracking is the most prominent of them: it enables people with physical limitations to operate an iPad or iPhone using only their eyes.

According to Apple’s announcement, the feature will use the front-facing camera of an iPhone or iPad to track a user’s eye movements as they navigate “through elements of an app.” Other accessibility features are on the way as well: visionOS will gain additional accessibility options; Vocal Shortcuts will let users complete tasks by creating a custom sound of their own; Vehicle Motion Cues will help lessen motion sickness for passengers using an iPhone or iPad in a moving vehicle; and Music Haptics will give users who are deaf or hard of hearing a new way to experience music through the iPhone’s Taptic Engine.
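For readers curious how eye control using only the front camera could work in principle, here is a minimal Swift sketch built on ARKit’s public face-tracking APIs (ARFaceTrackingConfiguration and ARFaceAnchor.lookAtPoint). This is purely illustrative: Apple has not said its Eye Tracking feature works this way, and a real system would need calibration, smoothing, and mapping of the gaze estimate onto on-screen elements.

```swift
import ARKit

// Illustrative sketch only (not Apple's Eye Tracking implementation):
// ARFaceTrackingConfiguration and ARFaceAnchor.lookAtPoint are public ARKit APIs
// that expose a rough estimate of where the user is looking.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is the estimated gaze target in the face anchor's coordinate space.
        // A production system would calibrate, smooth, and project this onto UI elements.
        let gaze = face.lookAtPoint
        print("Estimated gaze (face space): \(gaze)")
    }
}
```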

Cook didn’t disclose exactly when Eye Tracking will be available, but it is “likely to debut in iOS and iPadOS,” which will be released later this year. The tech giant also announced a Music Haptics feature, which will allow users who are deaf or hard of hearing to experience music through taps, textures, and refined vibrations matched to the audio. Built with Apple silicon, AI, and machine learning, these features combine the power of Apple hardware and software to advance the company’s decades-long goal of creating products that are accessible to all.
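As a rough illustration of how audio-matched taps can be produced on the Taptic Engine, here is a short Swift sketch using the public Core Haptics framework. It assumes a hypothetical list of beat timestamps has already been extracted from a track; it is not Apple’s Music Haptics implementation, just a minimal example of playing transient haptic “taps” at given times.

```swift
import Foundation
import CoreHaptics

// Illustrative sketch only: plays one transient haptic "tap" per beat timestamp.
// `beatTimes` is a hypothetical input; real beat extraction is out of scope here.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One short, sharp haptic event per beat.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```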

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.” Eye Tracking is just one of several accessibility features soon set to come to iPhones and iPads. Using AI, the upcoming speech features will recognize a user’s unique speech patterns and translate them into instructions that Siri can understand.

Meanwhile, an Atypical Speech feature will be available for users with speech impairments. “Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” Professor Mark Hasegawa-Johnson of the University of Illinois Urbana-Champaign said in the company’s release.
