The Personal AI Travel Companion is a multi-agent system designed to deliver hyper-personalized travel assistance at every stage of the journey — from initial inspiration to long-term travel memory. Unlike conventional travel apps that treat each booking as an isolated transaction, this system builds a continuous, evolving model of the traveler's preferences, behaviors, and aspirations.
At its core, an orchestrating AI coordinates eleven specialized agents across four distinct stages: pre-trip planning and booking, real-time in-trip guidance, post-trip curation and reflection, and long-term companion intelligence. Each stage feeds the next, creating a feedback loop that makes the system smarter and more attuned to the traveler with every journey.
This architecture document provides a complete specification of the system's agent topology, data flows, technical dependencies, and exhibition implementation scope. It is intended to serve as both an academic reference and a technical blueprint for the working demonstration.
The system is organized around a central orchestrating intelligence — the Core Personal AI — which coordinates four specialized stage clusters. Data and context flow continuously between stages, and a feedback loop from Stage 4 back to Stage 1 ensures the system evolves over time.
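The topology above can be sketched in a few lines of Python. This is a minimal illustration, not the specification: the class and method names (`CoreAI`, `run_journey`, `run_stage`) and the dictionary-based context are assumptions introduced here to show the four-stage pipeline and the Stage 4 to Stage 1 feedback loop.

```python
from enum import Enum, auto

class Stage(Enum):
    PRE_TRIP = auto()    # planning and booking
    IN_TRIP = auto()     # real-time guidance
    POST_TRIP = auto()   # curation and reflection
    COMPANION = auto()   # long-term companion intelligence

class CoreAI:
    """Sketch of the orchestrator: routes context through the four
    stage clusters and feeds Stage 4 insights back into Stage 1."""

    def __init__(self):
        self.profile = {}  # the evolving traveler model, persisted across trips

    def run_journey(self, intent: str) -> dict:
        context = {"intent": intent, "profile": self.profile}
        for stage in (Stage.PRE_TRIP, Stage.IN_TRIP,
                      Stage.POST_TRIP, Stage.COMPANION):
            context = self.run_stage(stage, context)
        # Feedback loop: long-term insights update the profile
        # that the next journey's Stage 1 will start from.
        self.profile.update(context.get("insights", {}))
        return context

    def run_stage(self, stage: Stage, context: dict) -> dict:
        # Placeholder: each cluster would dispatch to its specialized agents.
        context.setdefault("trace", []).append(stage.name)
        if stage is Stage.COMPANION:
            context["insights"] = {"last_intent": context["intent"]}
        return context
```

Keeping the orchestrator as the single writer of the traveler profile is what makes the loop safe: stages receive context, but only the Core AI commits long-term state.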
Stage 1 (Pre-Trip Planning and Booking): Transforms raw travel intent into a confirmed, personalized itinerary with all reservations secured.
Stage 2 (Real-Time In-Trip Guidance): Acts as an always-on local guide — sensing context, resolving disruptions, and bridging language barriers in real time.
Stage 3 (Post-Trip Curation and Reflection): Transforms raw trip data into curated memories and actionable insights that feed the next journey.
Stage 4 (Long-Term Companion Intelligence): Grows alongside the traveler over years — optimizing loyalty, safeguarding wellness, and anticipating the next adventure.
The Core Personal AI is not a passive router — it is the persistent intelligence that gives the system its character. It maintains a unified, evolving model of the traveler, mediates between agents, and ensures every interaction feels cohesive and intentional rather than stitched together from disconnected services.
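One way to make "a unified, evolving model of the traveler" concrete is a single mediated data structure. The sketch below is hypothetical: the field names (`travel_style`, `interests`, `past_trips`) and the `merge_observation` method are illustrative assumptions, not part of the specification.

```python
from dataclasses import dataclass, field

@dataclass
class TravelerModel:
    """Hypothetical unified traveler model held by the Core Personal AI.
    All field names are illustrative."""
    travel_style: str = "unknown"              # e.g. "slow travel", "adventure"
    interests: list = field(default_factory=list)
    past_trips: list = field(default_factory=list)

    def merge_observation(self, observation: dict) -> None:
        # Agents never write to the model directly; the Core AI mediates,
        # merging each agent's observation so the profile stays consistent.
        if "style" in observation:
            self.travel_style = observation["style"]
        for interest in observation.get("interests", []):
            if interest not in self.interests:
                self.interests.append(interest)
```

Routing all writes through one merge point is what keeps eleven agents from producing a contradictory picture of the same traveler.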
The exhibition demo presents a working proof-of-concept for the three highest-impact agents, demonstrating the core interaction loop — from preference intake through personalized itinerary generation — with voice interaction and live visual output. The remaining agents are represented architecturally and described in context.
The exhibition presents the system as an immersive, large-format installation. A visitor approaches a floor-standing screen — the AI companion appears and initiates a natural conversation about their dream destination. Within minutes, a personalized itinerary materializes in real time, with a live map, day-by-day activities, and cultural insights. The full agent architecture is visible as a live background layer, making the system's intelligence transparent and visceral.
Visitor approaches the screen. The AI avatar activates and greets them with a spoken welcome. Keyboard available as fallback input.
The AI asks: "Where have you always wanted to go?" — Preference profiling begins, extracting travel style and interests in natural dialogue.
A map materializes. Day-by-day activities appear in real time as the Itinerary Agent thinks — visitors watch the plan build itself.
The AI demonstrates the Language Agent — offering a phrase in the destination's language, with pronunciation and cultural context.
The background agent map animates — showing which agents activated, the data flows, and how the system made its decisions.
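The demo walkthrough above is a fixed linear sequence, which can be captured as a small ordered script. The step names and the `run_demo` driver below are illustrative assumptions for the exhibition build, not part of the architecture itself.

```python
# Hypothetical encoding of the five-step exhibition flow described above;
# step identifiers and descriptions are illustrative, not from the spec.
DEMO_STEPS = [
    ("greet",     "avatar activates and welcomes the visitor"),
    ("profile",   "preference intake through natural dialogue"),
    ("itinerary", "map and day-by-day plan build in real time"),
    ("language",  "phrase with pronunciation and cultural context"),
    ("reveal",    "background agent map animates the decision flow"),
]

def run_demo(on_step=print) -> list:
    """Drive the exhibition flow in order, invoking on_step for each stage.
    Returns the ordered list of step names that ran."""
    for name, description in DEMO_STEPS:
        on_step(f"{name}: {description}")
    return [name for name, _ in DEMO_STEPS]
```

A fixed sequence like this keeps the installation predictable for staff while the content within each step (the dialogue, the itinerary) remains fully generative.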