Most AI agents live in a text-only world, limited to chat interfaces. But what if they could hear and speak? In this video, we'll show you how to transform any LangGraph agent into a voice-enabled assistant. Using task mAIstro (our AI-powered task management app) as an example, we'll demonstrate:
How to add voice input using OpenAI's Whisper
How to enable natural speech output with ElevenLabs
A general workflow you can apply to any LangGraph agent (a minimal code sketch follows below)
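
The overall pattern is: transcribe the user's audio with Whisper, pass the transcript to your compiled LangGraph agent as a message, then speak the agent's reply with ElevenLabs. The sketch below illustrates that loop under a few assumptions: `graph` is your already-compiled LangGraph agent (e.g. task mAIstro), the audio lives in a local file, the voice name is just an example, and exact ElevenLabs client calls may differ across SDK versions.

# Minimal voice wrapper sketch: Whisper in, LangGraph agent in the middle, ElevenLabs out.
from langchain_core.messages import HumanMessage
from openai import OpenAI
from elevenlabs.client import ElevenLabs
from elevenlabs import play

openai_client = OpenAI()        # expects OPENAI_API_KEY in the environment
eleven_client = ElevenLabs()    # expects ELEVENLABS_API_KEY in the environment

def speech_to_text(audio_path: str) -> str:
    """Transcribe a recorded audio file with OpenAI Whisper."""
    with open(audio_path, "rb") as audio_file:
        transcript = openai_client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )
    return transcript.text

def text_to_speech(text: str) -> None:
    """Speak the agent's reply with ElevenLabs (voice name is illustrative)."""
    audio = eleven_client.generate(text=text, voice="Rachel")
    play(audio)

def voice_turn(graph, audio_path: str, thread_id: str = "voice-demo") -> str:
    """One voice interaction: transcribe, run the agent, speak the answer."""
    user_text = speech_to_text(audio_path)
    config = {"configurable": {"thread_id": thread_id}}
    result = graph.invoke({"messages": [HumanMessage(content=user_text)]}, config)
    reply = result["messages"][-1].content
    text_to_speech(reply)
    return reply

Because the voice layer only touches the agent's inputs and outputs, the same wrapper works for any LangGraph agent that accepts a messages-style state.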
Code repo:
https://github.com/langchain-ai/task_...
For more detail on task mAIstro and LangGraph deployment, see LangChain Academy and the related repos:
https://academy.langchain.com/courses...
https://github.com/langchain-ai/langc...
https://github.com/langchain-ai/langc...