When Jakob Nielsen recently highlighted a fundamental shift from command-based to intent-based interfaces, it resonated deeply with my work in AI innovation at Booking.com.
For decades, UX has focused on explicit commands – clicking buttons, selecting options, typing instructions – as the way users interact with interfaces. But Nielsen envisions a future where users can simply express their intent naturally, in their own words.
That in itself is transformational enough to be called a new paradigm, but I want to take this one step further: What I’m also seeing is the advent of interfaces that proactively understand and respond to user intent without the need for users to explicitly express it at all.
This future isn’t a distant possibility – it’s emerging in digital experiences right now, challenging our assumptions and reshaping our approach to interface design.
What fascinates me is the speed of this transformation. For the past few years, we’ve celebrated interfaces as “intuitive” if users can figure them out easily. Hierarchy, layout, affordances – all of that can be perfected so that users quickly find the content and controls they need to navigate the website or app. But what if they didn’t have to? What if the interface adapted dynamically, so users didn’t have to look for controls or navigate through flows?
Imagine you didn’t have to find your way through a building anymore. Instead, the building rearranged itself around you, exactly when and how you need it, based on what cues you provide about the place you want to go to.
At Booking.com, this transformation is actively being explored:
- Today, travelers still translate their dreams into search parameters using filters, sorters, and the like.
- Soon, they will be able to express their intent in natural language, simply sharing the vision of their perfect stay, just as Jakob Nielsen described.
- Looking further ahead, the platform will anticipate unspoken needs by intelligently connecting subtle cues about preferences and travel styles, while also drawing from historical patterns and broader contextual data. All in real-time, at the individual user level.
Not only will this make interfaces more convenient – it will fundamentally change how humans interact with technology. We’re moving from a world where users adapt to interfaces, to one where interfaces adapt to users.
It’s Not Just Personalization
The evolution of personalization illustrates how profoundly our approach to interface design is changing.
Traditional personalization, often exemplified by Amazon’s product recommendations or Netflix’s content suggestions, primarily focuses on content selection. Super valuable, for sure, but this represents only the surface of what’s possible. The real opportunity lies in personalizing the entire interaction model – not just what users see, but how they engage with the interface itself.
The next frontier involves adapting the entire experience – transforming how users explore and interact with content based on their unique behaviors, preferences, and real-time context.
This is where the integration of AI enables a fundamentally deeper level of personalization that transforms the entire interaction model.
Dynamic Interface Generation
The power of such adaptive interfaces would not come from a single AI system, but from the orchestrated interaction of multiple specialized systems working in concert.
This orchestration creates experiences that would be impossible with any single system:
- Natural Language Processing interprets user intent by converting ambiguous human expressions into structured understanding
- Intent models analyze user interactions and engagement to detect patterns and friction points
- Predictive models anticipate user needs based on historical patterns and current context
- Recommendation systems suggest not just content but interface configurations
- Feedback systems continuously learn from user responses
Let’s walk through a concrete example of how these systems could work together in a travel booking scenario. Imagine a user types “Mountain retreat with good internet for a month”:
First, a Natural Language Processing system breaks this down into structured components (a rough code sketch follows the list):

- Explicit requirements:
  - Mountains
  - Reliable internet
  - One-month stay
- Inferred requirements:
  - Remote-work-friendly setup
  - Long-term stay, with specific dates not critical
  - Kitchen facilities, breakfast included, or restaurants nearby
  - Natural setting, likely seeking quiet
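To make that tangible, here is a rough TypeScript sketch of what such a structured output could look like. Every type and field name here is a hypothetical stand-in, not an actual schema:

```typescript
// Hypothetical structured output of the NLP step. Every type and
// field name is an illustrative stand-in, not a real schema.
interface ParsedIntent {
  explicit: string[];                               // stated directly in the query
  inferred: { need: string; confidence: number }[]; // derived, with uncertainty
  stayLengthNights?: number;
  datesFlexible?: boolean;
}

const intent: ParsedIntent = {
  explicit: ["mountain location", "reliable internet", "one-month stay"],
  inferred: [
    { need: "remote-work-friendly setup", confidence: 0.9 },
    { need: "kitchen, breakfast, or restaurants nearby", confidence: 0.7 },
    { need: "quiet, natural setting", confidence: 0.8 },
  ],
  stayLengthNights: 30,
  datesFlexible: true,
};
```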
Next, intent models analyze this against the user’s past interactions and behavior patterns (a sketch of the resulting profile follows the list):

- Previous interactions show a preference for visual exploration over detailed lists
- Past bookings indicate a preference for comfort over budget considerations
- Past sessions indicate proficiency with advanced filtering options
- Interaction patterns suggest they make decisions based on location first
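A behavioral profile derived from those signals might be represented like this (again, all names and scores are illustrative assumptions):

```typescript
// Hypothetical behavioral profile an intent model might maintain.
// Field names and scores are illustrative assumptions.
interface BehaviorProfile {
  prefersVisualExploration: boolean; // galleries and maps over dense lists
  comfortOverBudget: boolean;        // inferred from past booking tiers
  advancedFilterProficiency: number; // 0..1, from past filter usage
  decidesByLocationFirst: boolean;   // map interactions precede price checks
}

const profile: BehaviorProfile = {
  prefersVisualExploration: true,
  comfortOverBudget: true,
  advancedFilterProficiency: 0.85,
  decidesByLocationFirst: true,
};
```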
The interface then adapts its structure and functionality (see the rule sketch after the list):

- Switches to a map-centric view, knowing this user thinks spatially
- Surfaces advanced work-related filters prominently (internet speed, workspace photos)
- Adjusts property cards to emphasize images of workspaces and surroundings
- Generates a comparison mode specifically for evaluating key amenities
- Modifies the booking flow to highlight monthly rate options and extended-stay policies
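Tying the two together, a simple rule layer could map the parsed intent and the profile to an interface configuration. A real system would presumably learn these mappings rather than hard-code them; this sketch (reusing the hypothetical ParsedIntent and BehaviorProfile types from above) only shows the shape of the idea:

```typescript
// Hypothetical rule layer mapping parsed intent plus behavioral profile
// to an interface configuration. A real system would learn these
// mappings; hard-coded rules here only show the shape of the idea.
interface InterfaceConfig {
  primaryView: "map" | "list" | "gallery";
  pinnedFilters: string[];
  cardImageFocus: "workspace" | "general";
  comparisonMode: boolean;
  bookingEmphasis: "monthly-rate" | "nightly-rate";
}

function configureInterface(
  intent: ParsedIntent,
  profile: BehaviorProfile
): InterfaceConfig {
  // Treat the stay as work-focused if the parser inferred it with
  // reasonable confidence.
  const workFocused = intent.inferred.some(
    (i) => i.need.includes("remote-work") && i.confidence > 0.5
  );
  return {
    primaryView: profile.decidesByLocationFirst ? "map" : "list",
    pinnedFilters: workFocused ? ["internet speed", "dedicated workspace"] : [],
    cardImageFocus: workFocused ? "workspace" : "general",
    comparisonMode: profile.advancedFilterProficiency > 0.5,
    bookingEmphasis:
      (intent.stayLengthNights ?? 1) >= 28 ? "monthly-rate" : "nightly-rate",
  };
}
```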
As the user explores, the interface continues to adapt:

- When comparing properties, it automatically offers a side-by-side view optimized for key decision factors
- Property details are expanded or collapsed based on user attention patterns
- The review section dynamically highlights remote work experiences
- Photo galleries adapt their layout and content focus based on user engagement
Finally, the system learns and adjusts in real-time (sketched below as a feedback loop):

- Since the user spends time looking at images, the interface offers a more immersive mode
- When they start comparing amenities and prices, it suggests a comparison-focused layout
- As they show interest in specific parts of the location, the interface highlights local amenities and transportation options
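Expressed in code, that continuous adjustment is essentially a feedback loop: interaction events nudge the configuration between renders. A minimal sketch, with invented event names and reusing the InterfaceConfig type from earlier:

```typescript
// Hypothetical feedback loop: observed interaction events nudge the
// configuration between renders. Event names are invented.
type InteractionEvent =
  | { kind: "dwell-on-photos"; seconds: number }
  | { kind: "open-amenity-comparison" }
  | { kind: "pan-map-to"; area: string };

function adaptOnEvent(
  config: InterfaceConfig,
  event: InteractionEvent
): InterfaceConfig {
  switch (event.kind) {
    case "dwell-on-photos":
      // Sustained engagement with imagery: offer an immersive gallery.
      return event.seconds > 30 ? { ...config, primaryView: "gallery" } : config;
    case "open-amenity-comparison":
      // The user started comparing: switch on the comparison layout.
      return { ...config, comparisonMode: true };
    case "pan-map-to":
      // Interest in an area: surface local amenities and transport there.
      return {
        ...config,
        pinnedFilters: [...config.pinnedFilters, `near:${event.area}`],
      };
  }
}
```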
While we’re not yet at the level of fully dynamic, intent-driven interfaces, the building blocks are emerging.
Let’s take this one level further and bring A/B testing into the mix. Traditionally, A/B testing compares static design variants. In an AI-driven interface, we could test subtle adaptations at the individual user level, learning which adjustments best serve each user’s goals. An interface might learn that certain users prefer visual browsing, and automatically adjust its presentation for other users who show a similar pattern.
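One way to frame this technically, and this is my framing rather than an established method, is as a multi-armed bandit rather than a classic A/B test: each layout variant is an arm, and the system gradually favors whichever variant a user or segment responds to best. A minimal epsilon-greedy sketch:

```typescript
// Minimal epsilon-greedy bandit over layout variants for one user
// segment. An illustrative sketch, not a production experimentation
// system; real setups would use contextual bandits with guardrails.
class LayoutBandit {
  private counts: number[];
  private rewards: number[];

  constructor(private variants: string[], private epsilon = 0.1) {
    this.counts = variants.map(() => 0);
    this.rewards = variants.map(() => 0);
  }

  choose(): number {
    // Occasionally explore a random variant...
    if (Math.random() < this.epsilon) {
      return Math.floor(Math.random() * this.variants.length);
    }
    // ...otherwise exploit the best-performing one so far.
    const means = this.rewards.map((r, i) =>
      this.counts[i] > 0 ? r / this.counts[i] : 0
    );
    return means.indexOf(Math.max(...means));
  }

  record(variant: number, reward: number): void {
    // Reward should reflect the user's goal: task completion,
    // engagement, or an explicit satisfaction signal.
    this.counts[variant] += 1;
    this.rewards[variant] += reward;
  }
}

const bandit = new LayoutBandit(["visual-browse", "list-detail", "map-first"]);
const pick = bandit.choose();
// ...render the chosen variant and observe the outcome...
bandit.record(pick, 1); // e.g. the user completed the step they came for
```

The key difference from classic A/B testing is that the unit of learning shifts from the whole population to the individual user or segment.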
This isn’t about creating an autonomous system that hyper-optimizes for business metrics. Rather, it’s about developing interfaces that deeply understand and support what users are trying to accomplish.
Bringing this back to a travel example: If a user is planning a family vacation, the system should recognize the complexity of family travel needs, from budget considerations to planning activities for different age groups, and the interface should support a planning process where one or two people make decisions for a group.
From Theory to Reality
Building these systems presents a fascinating set of interconnected technical challenges that push the boundaries of what’s possible in interface design. The complexity comes not just from individual technical hurdles, but from the need to solve multiple critical challenges simultaneously:
Performance & Architecture
- Real-time interface adaptation requires sophisticated processing and response systems
- Predictive models become essential to keep flows smooth
- Loading and caching strategies must balance responsiveness with depth of customization (one possible tactic is sketched below)
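To illustrate the performance point: one plausible tactic, purely an assumption on my part, is to precompute the interface configurations a predictive model ranks as most likely, so switching feels instant. A sketch, reusing the earlier InterfaceConfig type:

```typescript
// Hypothetical precomputation cache: configurations a predictive model
// ranks as likely next states are built ahead of time, so switching
// feels instant instead of waiting on a model round-trip.
const configCache = new Map<string, InterfaceConfig>();

async function prefetchLikelyConfigs(
  predictLikelyStates: () => Promise<{ key: string; config: InterfaceConfig }[]>
): Promise<void> {
  // e.g. the top three predicted next states for this user
  for (const { key, config } of await predictLikelyStates()) {
    configCache.set(key, config);
  }
}

function getConfig(key: string, fallback: InterfaceConfig): InterfaceConfig {
  // Serve the precomputed configuration when available; degrade gracefully.
  return configCache.get(key) ?? fallback;
}
```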
Consistency & Coherence
- Dynamic elements must maintain visual and functional consistency
- Design systems must support variation while preserving identity
- Accessibility considerations become more nuanced with adaptive interfaces
Testing & Validation
- Traditional testing methods must evolve for dynamic interfaces
- Quality assurance requires new approaches for adaptive elements
- Success metrics must consider increasingly non-linear flows
The Evolution of the Design Process
Creating truly adaptive interfaces requires a fundamental transformation in our approach to design. This shift manifests across multiple dimensions:
The transition from designing fixed interfaces to creating systems of potential interfaces represents a fundamental shift in how design teams operate and collaborate. This evolution has profound implications for team structure, tools, and methodologies.
The Role of Designers
- From interface design to system architecture
- From linear user flows to non-linear scenarios and adaptation rules
- From static components to dynamic systems
The Role of Design Leadership
- Balancing innovation with a grounded focus on user needs
- Managing uncertainty and experimentation
- Developing new success metrics
- Developing teams with hybrid skill sets
- Creating new processes for exploring adaptive design
New Design Thinking
- Possibility spaces rather than fixed solutions
- Adaptation patterns rather than static states
- System behaviors rather than interface elements
- Learning rules rather than fixed logic
An Evolution of Skills
- Understanding of ML capabilities and limitations
- Prompt design and AI literacy
- Probability-based design decision making
- Systems thinking and adaptive design skills
Complex Design Decisions
- Balancing user agency with interface adaptation
- Determining appropriate thresholds for subtle versus explicit adaptations
- Establishing clear boundaries for interface capability
- Communicating adaptive capabilities to users
Team Structure & Collaboration
- Designers need to work closely with ML engineers and ML product managers to understand possibilities
- User researchers develop new methodologies for studying adaptive experiences
- Design systems (and their teams) need to evolve from maintaining static components to creating adaptive systems
- Product managers move from feature development to evolving system learning capabilities
Tools & Methodologies
- Prototyping tools need to support state generation rather than state design
- Simulation becomes increasingly important to test real-time, dynamic and non-linear experiences
- Version control systems need to be established for dynamic interface states
- Documentation systems need to capture adaptation rules and patterns
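To ground those last two points: adaptation rules could live as versioned, self-documenting data rather than prose specs. A hypothetical format (the schema and names are my own):

```typescript
// Hypothetical versionable record of a single adaptation rule, so that
// documentation and version control track behavior, not just screens.
const immersiveGalleryRule = {
  id: "immersive-gallery-on-photo-dwell",
  version: "1.2.0",
  trigger: "user dwells on the photo gallery for more than 30 seconds",
  adaptation: "offer a switch to an immersive, full-bleed gallery view",
  rationale: "sustained photo engagement suggests visual decision-making",
  guardrails: ["never switch views mid-scroll", "always offer an undo"],
};
```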
The Evolution of Trust and Literacy
As these systems mature, we’re witnessing the emergence of new patterns in how users understand and interact with adaptive interfaces. The relationship between users and these interfaces requires new mental models and trust frameworks.
Adaptation Literacy
- Understanding how to influence adaptation patterns
- Learning to interpret and anticipate an adaptive system’s behavior
- Maintaining control while benefiting from adaptation
Trust Development
- Creating transparency about how behavior is interpreted and predicted
- Developing clear methods for user control and override
- Establishing new conventions for communicating system capabilities and limitations
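In practice, that control and override could be as simple as user-visible adaptation settings that the system must always respect. A hypothetical sketch, reusing the earlier InterfaceConfig type:

```typescript
// Hypothetical user-facing adaptation settings: the user decides what
// the system may adapt, and overrides always win. Names are invented.
interface AdaptationSettings {
  adaptLayout: boolean;    // allow automatic view switches (map/list/gallery)
  adaptFilters: boolean;   // allow auto-pinned filters
  explainChanges: boolean; // show a "why did this change?" note
}

function applyUserOverrides(
  proposed: InterfaceConfig,
  current: InterfaceConfig,
  settings: AdaptationSettings
): InterfaceConfig {
  return {
    ...proposed,
    // Respect the user's choice to keep layout and filters stable.
    primaryView: settings.adaptLayout ? proposed.primaryView : current.primaryView,
    pinnedFilters: settings.adaptFilters ? proposed.pinnedFilters : current.pinnedFilters,
  };
}
```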
A Personal Reflection
In my role leading AI innovation projects with global impact, I’m privileged to participate in this transformation. But the most compelling aspect to me isn’t the technology itself, but how it enables more natural, human-centered interactions.
That promise of intent-based interfaces isn’t about removing human decision-making – it’s about enriching it. Imagine planning a complex trip without fighting against dropdown menus and calendar widgets, instead engaging in a fluid exploration of possibilities that evolves with your understanding of what you want. That’s the future we’re building toward: one where technology feels less like a tool to be mastered and more like a collaborative partner in achieving human goals.
As we push these boundaries, our north star remains clear: every advancement must serve genuine human needs, not just technological possibilities. This is what makes our current moment so exciting – we’re not just making interfaces smarter, we’re making them more deeply aligned with how humans naturally think, work, and explore.