In an interview with VentureBeat following yesterday’s Amazon announcement introducing the new large language model (LLM) powering its Alexa device, the company’s generative AI leader, Rohit Prasad, said Alexa is now a “super agent.”
Alexa’s LLM is now integrated with “thousands and thousands” of devices and services, said Prasad, who joined Amazon in 2014 as director of machine learning on Alexa and is now SVP and chief scientist, artificial general intelligence. He told VentureBeat at Amazon’s new second headquarters in Arlington, Virginia, that the model connects to the largest set of APIs he could think of. That means Alexa is now “grounded” in real-time knowledge that is useful and connected directly to users, he explained.
Amazon’s new Alexa as ‘momentous’ as the original
Though Amazon has been working with AI in Alexa and its other devices for years, the debut of what he called a “massive” state-of-the-art large language model, built with a decoder-only architecture, feels “as momentous as when we brought Alexa to life the first time [in 2014],” he said. But he reiterated what Amazon devices chief Dave Limp said at the announcement event: “Our North Star has been the same: we want that personal AI that you can interact with naturally, that can do anything on your behalf.”
He emphasized that while the excitement around generative AI is “great — you want this kind of excitement in AI,” Amazon’s road to conversational dominance is quite different from that of chatbots like OpenAI’s ChatGPT or Anthropic’s Claude.
“We are not a chatbot in a browser. You’re interacting with … it’s actually doing very useful things in the real world,” he said. “There’s utility for creativity for brainstorming on the desktop and the browser, but that’s not [our] path.”
The multimodal, multilingual and multifaceted model is “hugely complex,” he said, combining computer vision, natural language processing and pattern recognition. Yet, he added, it is the “complex being made simple” for users and developers.
Prasad refutes criticisms that Alexa was ‘dumb’
Amazon’s Alexa has been criticized in recent years for a general lack of usefulness — Microsoft CEO Satya Nadella reportedly said in March that Alexa and its AI assistant ilk were “all dumb as a rock.” And in recent months, Prasad has had to defend Amazon against accusations that it had missed out on the generative AI boom.
“I refute the comment that [Alexa was] dumb,” said Prasad. “We have hundreds of millions of customers using it, more than half a billion devices have been sold and interactions with Alexa have grown by 30%.” As a technologist who “knows the guts of the large language models,” he said that there is a big difference between tools like ChatGPT and devices like Alexa, which does “real things in the real world.”
That requires some of the power of large language models, but also making the model even better for the home by integrating personal context: What do you like to listen to? What do you like to watch? Who’s your favorite team? Are you vegetarian?
“All that makes these LLMs far more useful and much smarter,” he said. “For example, if you said to Alexa, ‘it’s hot in here,’ if Alexa was not integrated with your personal context, it might say to go to the beach. But if you’re in a room, it knows that you have a connected thermostat, and should lower the temperature — so it’s not just going to generate cool responses or tell you things, but actually does things right.”
Addressing questions about privacy
Prasad addressed questions about data privacy — concerns that have dogged Alexa in the past, as well as other home devices like Roomba (Amazon signed an agreement last year to acquire Roomba’s owner, iRobot, but the deal has not yet closed).
“I don’t think you would put an AI in your home if you didn’t trust it,” said Prasad. “Privacy and trust is paramount.” Any collection of data, he emphasized, has to focus on customer permission.
“We’ve been very transparent from day one, what’s been collected [and] you can go and look at what has been collected,” he said. “And you can always go and check in your privacy dashboard of what is on or what is not on by default as well. That principle never changes.”
People should not forget that Alexa is an AI
But while Prasad is excited about Alexa’s new, more human-like and seamless capabilities, and recognizes that people almost take Alexa for granted, he emphasized that he never wants people to forget that Alexa — the device — is an AI.
“I want to be very transparent that Alexa is an AI,” he said. “I don’t know what will happen in terms of how it’s being adopted in homes in the future. But I can at least say that if there’s any point where people forget it’s an AI, then Alexa should remind people that it is an AI.”