- Apple is developing an “onscreen awareness” feature to allow Siri to understand and interact with the content currently displayed on your screen
- Apple will also provide APIs to developers for integrating onscreen awareness into their third-party apps and is currently testing ChatGPT integration, allowing Siri to answer questions based on screenshots
- While not available in the iOS 18.2 beta, “onscreen awareness” may arrive in time for iOS 18.4 in 2025
Among the digital assistants, Siri has fared fairly well (certainly compared with Cortana, Microsoft's ill-fated rival assistant). Now Apple is working on making Siri even smarter by giving it a better sense of what's on your screen, a capability it calls 'onscreen awareness.'
Apple has detailed the feature's development on an Apple Developer documentation page, which also notes that it's due to be included in various upcoming Apple operating system (OS) beta releases for testing.
Apple originally showcased onscreen awareness in June 2024, and this documentation is a pretty solid indication that the feature is still in active development.
The core idea of onscreen awareness is straightforward: if you're looking at something on your screen, say a document or a browser with a page open, and you have a question about it, you can ask Siri (equipped with Apple Intelligence). Siri should then respond with relevant information or perform the action you ask for, such as sending content to a supported third-party app.
If it works as intended (and that’s a big ‘if’), it will result in a smarter Siri that won’t need you to describe what you want it to do as…
Read full post on Tech Radar