This is the wrong abstraction and a hidden platform play
Good
-
A combination of text/voice-based interaction plus traditional UI shown contextually is 100% how apps will work in the future (just go re-watch Iron Man).
-
Coupling UI with agentic behaviours allows deep, hyper-customised UX for each user. This will increase user engagement and retention.
Bad
-
This is the wrong abstraction; it is compromised at birth. "In order to make it simple enough for the AI to understand, we need to simplify the UI down to a one-dimensional list of components." That will not produce engaging UX. It is a forms engine, and it feels like work to your users.
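To make the "one-dimensional list" point concrete, here is a minimal sketch of what such a flattened schema tends to look like. The type and field names are entirely hypothetical, not any vendor's actual schema; the point is that once the agent can only emit a linear sequence of generic widgets, every screen degenerates into a top-to-bottom form:

```typescript
// Hypothetical flat component schema -- illustrative only, not an
// actual vendor API. Note there is no nesting, layout, or composition:
// just an ordered list of generic widgets.
type Component =
  | { kind: "heading"; text: string }
  | { kind: "text"; text: string }
  | { kind: "input"; label: string; id: string }
  | { kind: "button"; label: string; action: string };

// The best the agent can produce in this model: a form.
const screen: Component[] = [
  { kind: "heading", text: "Book a flight" },
  { kind: "input", label: "From", id: "from" },
  { kind: "input", label: "To", id: "to" },
  { kind: "button", label: "Search", action: "search" },
];

// Rendering collapses to walking the list top to bottom.
function render(components: Component[]): string {
  return components
    .map((c) => {
      switch (c.kind) {
        case "heading": return `# ${c.text}`;
        case "text": return c.text;
        case "input": return `[${c.label}: ______]`;
        case "button": return `(${c.label})`;
      }
    })
    .join("\n");
}
```

Whatever the renderer looks like, the expressiveness ceiling is set by the schema: with no hierarchy or layout semantics, "design" reduces to ordering form fields.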
-
Handing UI design over to the agent is the wrong separation of concerns. Agents are good at understanding user input; developers and designers are good at UI design. Pushing more and more into the model means training and paying for ever larger, ever more expensive models. Enough already.
-
Vendor lock-in and walled gardens: for an agent to understand this new declarative UI semantics, it has to be infused into the model, either during training or via fine-tuning. That means you need a model that understands it specifically, which in practice means Google models accessible via Google APIs. We barely have agreement across models on how tool calling should work, let alone this peculiar abstraction. For this to become universal, you would need a huge groundswell of industry support (like Anthropic's MCP), and I don't see it.