"accelerate everyday tasks" in apps without intents?

From https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/:

Powered by Apple Intelligence, Siri becomes more deeply integrated into the system experience. With richer language-understanding capabilities, Siri is more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks.

From https://developer.apple.com/apple-intelligence/:

Siri is more natural, more personal, and more deeply integrated into the system. Apple Intelligence provides Siri with enhanced action capabilities, and developers can take advantage of pre-defined and pre-trained App Intents across a range of domains to not only give Siri the ability to take actions in your app, but to make your app’s actions more discoverable in places like Spotlight, the Shortcuts app, Control Center, and more. SiriKit adopters will benefit from Siri’s enhanced conversational capabilities with no additional work. And with App Entities, Siri can understand content from your app and provide users with information from your app from anywhere in the system.

Based on this, as well as the video at https://developer.apple.com/videos/play/wwdc2024/10133/ , my understanding is that for Siri to be able to execute tasks in an application, that application must adopt the App Intents framework.
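For context, adopting App Intents means exposing each action as a type the system can discover and invoke. A minimal sketch (the intent name, parameter, and dialog here are hypothetical, not from Apple's materials):

```swift
import AppIntents

// Hypothetical example of an action exposed to Siri / Shortcuts /
// Spotlight via the App Intents framework. "OpenNotebookIntent" and
// its "name" parameter are illustrative placeholders.
struct OpenNotebookIntent: AppIntent {
    // The title shown in the Shortcuts app and used by Siri.
    static var title: LocalizedStringResource = "Open Notebook"

    // A user-supplied parameter Siri can fill in from the request.
    @Parameter(title: "Notebook Name")
    var name: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic to open the named notebook would go here.
        return .result(dialog: "Opening \(name).")
    }
}
```

The point being: the action has to be declared in the app's own code, which is why I'm asking about apps where that isn't possible.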

Can someone at Apple please clarify: will it be possible for Siri or some other aspect of Apple Intelligence / Core ML / Create ML to take actions in applications which do not support these APIs (e.g. web apps, Citrix apps, legacy apps)?

Thank you!

It was mentioned in the Platforms State of the Union that Siri can automatically take action on menu items (around the 13-minute mark), and that it has access to any text on screen presented through the standard text systems and can take action with that text.

Thank you theskyfloor!

Here is a link to Platforms State of the Union 2024: https://developer.apple.com/videos/play/wwdc2024/102/

From 13:05:

"Siri will be able to invoke any item from your app's menus. Second: Siri will be able to access text displayed in any app that uses our standard text systems. This will allow users to directly reference and act on text visible on screen."

I believe this means it won't work with virtualized apps, such as those delivered through Citrix or accessed remotely over a network (e.g., via Tailscale), since their menus and text are rendered as pixels rather than through the standard text systems. Does anyone know if there is planned support for virtual apps?

As a workaround, https://github.com/OpenAdaptAI/OpenAdapt is a Mac-compatible open-source implementation of similar functionality that supports virtual apps.

(Although this particular feature is shown on iOS only, I believe it applies to macOS as well, since the screenshots in the video are from both macOS and iOS.)
