Flutter's GenUI Package Overhauled: New Architecture Gives Developers Direct Control Over AI Interactions
Breaking: GenUI and A2UI Receive Major Updates
The Flutter team has released significant updates to both the GenUI package and the underlying A2UI protocol, fundamentally changing how developers build generative user interfaces. The latest version of package:genui (v0.9.0) adopts a "Prompt First" philosophy and decouples the architecture, giving developers direct control over Large Language Model (LLM) connections. This marks a shift from the previous "Structured Output First" approach where A2UI messages were streamed through structured output APIs. The updates are driven by adoption of A2UI protocol v0.9.
Background
GenUI, short for Generative UI, is a user experience pattern where an AI agent not only generates content but also decides how that content should be displayed and made interactive. For Flutter developers, implementing GenUI previously relied on the A2UI protocol—an open standard for agent-client collaboration on UI composition and state. The Flutter team built the genui package to connect agents with a catalog of widgets and present them to users.
The latest updates streamline this process. According to a Flutter engineering lead, "These changes empower developers to customize every aspect of the AI interaction loop without being constrained by the framework's abstractions."
What This Means
For Flutter developers, the decoupling means greater flexibility. The old ContentGenerator class has been removed. In its place, three distinct layers now handle UI state, messaging, and chat management:
- Engine (SurfaceController): manages UI state and rendering.
- Transport (A2uiTransportAdapter): streams messages between the agent and the renderer.
- Facade (Conversation): provides a high-level API for chat state management.
Previously, provider-specific wrapper packages (e.g., genui_dartantic, genui_google_generative_ai, genui_firebase_ai) were required. With this release, those packages are no longer needed. As one developer noted, "You can now plug in any LLM provider directly, tweak generation settings, and add custom functions without going through the framework's API."
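The three layers above can be wired together roughly as follows. This is a sketch only: the constructor parameters shown here are assumptions for illustration, not the package's documented API.

```dart
import 'package:genui/genui.dart';

void setUpGenUi() {
  // Engine: owns UI state and rendering for a generated surface.
  final controller = SurfaceController();

  // Transport: streams A2UI messages between the agent and the renderer.
  // (Constructor parameter is a hypothetical name.)
  final transport = A2uiTransportAdapter(controller: controller);

  // Facade: high-level chat state management on top of the transport.
  final conversation = Conversation(transport: transport);
}
```

Because the app constructs each layer itself, the LLM connection can come from any provider SDK rather than a genui wrapper package.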
The shift to a Prompt First approach means agents now include blocks of JSON as text in their responses, rather than relying on structured output APIs.
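In practice, this means the client pulls A2UI messages out of ordinary model text. A minimal extractor might look like the following; the fenced-JSON convention and message shape here are illustrative assumptions, not part of the A2UI specification as described in this article.

```dart
import 'dart:convert';

/// Extracts JSON blocks embedded in an LLM's plain-text response.
/// Assumes the agent wraps each A2UI message in a ```json fence.
List<Map<String, dynamic>> extractA2uiMessages(String response) {
  final fence = RegExp(r'```json\s*([\s\S]*?)```');
  return fence
      .allMatches(response)
      .map((m) => jsonDecode(m.group(1)!.trim()) as Map<String, dynamic>)
      .toList();
}
```

The trade-off: structured output APIs guarantee well-formed JSON, while the Prompt First approach works with any provider but requires the client to tolerate malformed blocks.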
Migration Guide
Developers migrating from v0.7.0 to v0.9.0 must follow specific steps, including dependency cleanup and wiring up new chat loops. The most significant code change: instead of passing a ContentGenerator to a SurfaceController, your app now directly sets up an LLM connection and passes messages through a TransportAdapter. The old approach looked like:
```dart
final generator = FirebaseAiContentGenerator(...);
```
Now, you control chat history, retry logic, and error handling. A full migration guide is available on the Flutter documentation site.
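A sketch of the new wiring, under stated assumptions: callMyLlm stands in for whatever provider SDK call your app uses, and handleResponse is a hypothetical method for feeding the model's reply into the transport; neither is an API confirmed by this article.

```dart
final controller = SurfaceController();
final transport = A2uiTransportAdapter(controller: controller);
final conversation = Conversation(transport: transport);

// The app now owns chat history, retries, and error handling directly.
final history = <String>[];

Future<void> sendPrompt(String prompt) async {
  history.add(prompt);
  for (var attempt = 1; attempt <= 3; attempt++) {
    try {
      final reply = await callMyLlm(history); // your provider's SDK call
      history.add(reply);
      transport.handleResponse(reply); // hypothetical: hand A2UI JSON to the renderer
      return;
    } catch (e) {
      if (attempt == 3) rethrow; // surface the error after the final retry
    }
  }
}
```

The key difference from the old ContentGenerator flow is visible here: retry policy and history are plain application code, not framework behavior.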
Industry Impact
This architectural change positions Flutter as a more flexible platform for AI-driven UIs. By removing the rigid ContentGenerator abstraction, the framework aligns with developer preferences for fine-grained control. Early adopters report that the new architecture reduces boilerplate and enables faster iteration on conversational AI features.
Related Articles
- Go Team Launches 2025 Developer Survey: Feedback to Shape Future of Language
- Safeguarding Configuration Rollouts at Scale: A Practical Guide to Canarying and Progressive Deployments
- Go 1.26 Unleashes Completely Rewritten 'Go Fix' Tool to Modernize Codebases
- Kubernetes v1.36 Milestone: Declarative Validation Becomes Fully Stable
- JavaScript's Date Nightmare Finally Gets a Fix: Temporal API Promises to End Time-Based Bugs
- Unlocking the Power of AI-Assisted Programming: Key Insights and Frameworks
- Why Great AI Engineers Need Product Management Skills
- Go 1.25 Introduces Flight Recorder for Real-Time Debugging of Long-Running Services