This technology is made possible by the device’s camera and AI
Nowadays we take them for granted and they no longer seem like much, but not so long ago touch screens were an incredible technological revolution, and thousands of people needed time to get used to them. They are now the norm, and a device without a touchscreen seems technologically behind the times.
As is usual in this sector, companies and researchers are working to find the next great way to interact with smart devices. Voice has emerged as a very viable option: with virtual assistants like Siri or Alexa, we can ask them to carry out numerous tasks without lifting a finger.
But technology always aspires to more, and although it may seem impossible right now, a growing number of projects let us interact with these devices without saying or moving anything, just by looking at them.
This is how Apple described the new feature it introduced last Wednesday in honor of Global Accessibility Awareness Day, designed so that people with disabilities are not excluded from technology.
The Eye Tracking tool is a system powered by artificial intelligence (AI) that lets users navigate their iPhone or iPad with just their eyes. To do this, it combines the device’s front-facing camera with on-device machine learning to interpret what the user wants to do from their gaze alone.
Undoubtedly, what attracts the most attention, and what would be a huge achievement for Apple if it works well, is that it requires no extra hardware to function. That sets it far apart from the technology of Neuralink, Musk’s company, which has enabled a man to control a screen with his thoughts, but only after implanting a chip in his head; and as we reported last week, malfunctions have already surfaced.
Users can navigate through the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures using only their eyes.