Deliver hyper-personalized viewer experiences with an agentic AI movie assistant using Amazon Bedrock AgentCore and Amazon Nova Sonic 2.0
TL;DR
AWS demonstrates two practical use cases for an AI-powered movie assistant that learns user preferences through natural conversation and delivers personalized recommendations.
Key Points
- The system combines the Strands Agents SDK, Amazon Bedrock AgentCore, and the Amazon Nova Sonic 2.0 voice model into a full agentic stack (a wiring sketch follows this list).
- Model Context Protocol (MCP), an open standard for connecting tools to LLMs, serves as the communication layer between the components (a minimal server sketch follows this list).
- The goal: a personal entertainment concierge that understands context and acts proactively, not just a smarter search bar.
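
To make the MCP point concrete, here is a minimal sketch of an MCP tool server built with the official MCP Python SDK's FastMCP helper. The server name, the `recommend_movies` tool, and the toy genre catalog are illustrative stand-ins, not details from the AWS post.

```python
# movie_mcp_server.py — hypothetical MCP server exposing a recommendation tool.
# Requires the official MCP Python SDK: pip install "mcp[cli]"
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("movie-recommendations")

# Toy in-memory catalog standing in for a real recommendation backend.
CATALOG = {
    "sci-fi": ["Arrival", "Dune", "Ex Machina"],
    "thriller": ["Prisoners", "Sicario", "Gone Girl"],
}

@mcp.tool()
def recommend_movies(genre: str, limit: int = 3) -> list[str]:
    """Return up to `limit` movie titles for the given genre."""
    return CATALOG.get(genre.lower(), [])[:limit]

if __name__ == "__main__":
    # stdio transport, which agent-side MCP clients typically spawn as a subprocess.
    mcp.run(transport="stdio")
```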
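And the agent side, as a minimal sketch assuming the Strands Agents SDK's documented Agent and MCPClient interfaces; the system prompt and file name are made up for illustration, and the post's Nova Sonic voice layer and AgentCore deployment are omitted for brevity.

```python
# agent_client.py — sketch of a Strands agent consuming the MCP server above.
# Assumes: pip install strands-agents, plus movie_mcp_server.py from the previous sketch.
from mcp import StdioServerParameters, stdio_client
from strands import Agent
from strands.tools.mcp import MCPClient

# Launch the MCP server as a subprocess and talk to it over stdio.
mcp_client = MCPClient(
    lambda: stdio_client(
        StdioServerParameters(command="python", args=["movie_mcp_server.py"])
    )
)

with mcp_client:
    # Discover the tools the server exposes (here: recommend_movies).
    tools = mcp_client.list_tools_sync()

    # Agent defaults to an Amazon Bedrock model, so AWS credentials are assumed.
    agent = Agent(
        tools=tools,
        system_prompt="You are a personal movie concierge.",
    )
    agent("I loved Arrival — what should I watch next?")
```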
Nauti's Take
AWS is cleverly using this as a showcase for three of its own products – Bedrock AgentCore, Nova Sonic, and the Strands SDK – but the technical core is genuinely interesting regardless. MCP as a connector layer between agents and external tools is visibly cementing itself as the de facto standard, and that is a healthy development for the whole ecosystem.
A voice-controlled movie concierge sounds like a gimmick, but it is a realistic example of how agents could replace UI layers in the medium term. The real question is whether users will actually want to share with their streaming provider the level of contextual data such a system requires.