WHEN: Today, Tuesday, May 14, 2024
WHERE: CNBC’s “Closing Bell: Overtime”
Following is the unofficial transcript of a CNBC exclusive interview with Alphabet CEO Sundar Pichai on CNBC’s “Closing Bell: Overtime” (M-F, 4PM-5PM ET) today, Tuesday, May 14. Following are links to video on CNBC.com: https://www.cnbc.com/video/2024/05/14/alphabet-ceo-sundar-pichai-we-can-do-google-search-a-lot-better-with-generative-ai.html and https://www.cnbc.com/video/2024/05/14/alphabet-ceo-search-uses-geminis-intelligence-and-grounds-it-with-what-it-knows-about-the-world.html.
All references must be sourced to CNBC.
DEIRDRE BOSA: Morgan, thank you very much. And, Sundar, thank you so much for making the time after that fantastic keynote.
SUNDAR PICHAI: So great to be here. Thank you.
BOSA: So this is pretty much the biggest overhaul of search that we have seen in, what, two decades? This new experience will be available to over a billion users by the end of the year. Why did you wait until now?
PICHAI: You know, in some ways, we have been evolving it continuously. The good thing about search is, people comfortably use it. They take it for granted. We have been answering questions for a while. But with generative AI, we can do it a lot better. We have been testing it for a while. And we now feel it’s the right moment to roll it out broadly.
BOSA: And feedback has been good, right, from the users who have been trying it?
PICHAI: Yes, user engagement has been positive. The feedback has been great. I think it makes the product much better, and so it’s a great direction.
BOSA: What about advertisers? Because this will change the business model. In some cases, you’re going to get links from a traditional search. In some cases, you’re going to get a generative AI answer, which would move those links lower down on the page. Are they ready for this moment? What are you telling them about their ability to reach your users?
PICHAI: You know, the great thing is, users still value commercial information. Our ads work based on intent and quality and relevance at the right time. We have been able to test that in the context of AI Overviews and it’s working well, as we expected it to. So, I think it’ll be a smooth transition, and that’s what we are seeing.
BOSA: I think I heard Liz Reid say that it’s leading to more searches, but the generative AI, or AI Overview, as you’re calling it, is it leading to more or less clicks?
PICHAI: In general, we find it’s both overall increasing usage, and when we look at it year on year, we have been able to grow traffic to the ecosystem. Compared to most of the players, we are prioritizing approaches which will generate traffic as well, so we are working hard on that.
BOSA: Does it change the business model? How are you thinking about that?
PICHAI: No, I think, about a year ago, people had questions on whether this would cost too much to serve. We have brought down costs 80 percent. I don’t think that will be a concern. I think the way we have been working and the way we are rolling it out, I feel like we are set up in a pretty good way, and we can build on from here.
BOSA: Right. Let’s talk about costs. You brought it up. SemiAnalysis estimates that a single chat with ChatGPT could cost up to 1,000 times as much as a simple search. As you said, you have brought that cost down. But bringing AI Overviews to everyone in America, to over a billion users by the end of the year, that has to raise the cost on your side.
PICHAI: You know, it’s still maybe more expensive than a traditional query, but not by much. You know, we’re in a — just in the last year, we have made our models maybe overall about 80 times more efficient. And so, you know, this is what Google was set up for. For 25 years, we have built our own infrastructure from the ground up. And it’s an area I feel super comfortable that we can actually do it well.
BOSA: Is that because you’re using your own in-house custom TPUs, or do you still use GPUs for the AI Overviews searches?
PICHAI: I mean, we are a close partner of Nvidia’s, and we definitely use both GPUs, as well as our own in-house hardware. But it’s more about the entire end to end, what we call an AI Hypercomputer, how we put it all together and run it super efficiently.
BOSA: Right, so not material costs that go up from this form of searching?
PICHAI: That’s right.
BOSA: Critics have said for some time that search has become more cluttered over the years. With AI Overviews, you’re kind of adding more on to that. And competitors, like Perplexity, to name one, have sprung up that have rethought the entire user interface, the whole user experience, to a lot of fanfare. Why not use this moment to completely overhaul the search experience instead of adding new layers on top?
PICHAI: Oh, in some ways, that’s what we are doing. When you say AI Overviews, we’re kind of organizing it for you. It has links in it, so it’s not like something that just goes on top alone. Today, we showed good examples where it almost organizes the page for you. And so I actually view it as we are simplifying the experience over time. I mean, our feedback has been very positive as we test it. People actually find the experience getting better. So, I think it’s an exciting direction.
BOSA: Would simplifying it the most be just putting it straight to Gemini? I mean, especially as users get used to other chatbots and going directly to them, why not just kind of go all in with Gemini, which was such a huge focus of the keynote in the last year?
PICHAI: You know, what search does is unique, in the sense that it takes the intelligence of Gemini, and we ground it with what search knows about the world. You know, what people really value is accurate, trustworthy information. And I think that’s part of — part of — even in this moment, I think people find Google Search very valuable. And they also care about what’s out there on the Web. So, sometimes, they’re looking for a quick answer. Sometimes, they actually want to go out and learn more. So, getting that balance right is also what search does well, I think.
BOSA: And now you’re letting technology make that judgment whether to get links directly to other Web sites or give a generative AI answer. How are you explaining that, again, to, like, your advertisers and merchants?
PICHAI: You know, I mean, they — they see it in their data, right? There are advertisers who are part of this AI Overviews as we are rolling it out. I think they will see it in their performance. Every time we have these transitions, people are a bit uncertain. But I think we have done this from desktop to mobile. You know, when local and social content became much more available, we integrated it seamlessly in search. We are doing the same with AI and we have been doing it for a decade. So, I view it as we will be able to build upon it.
BOSA: I want to get to Project Astra because that was certainly one of the most exciting parts of the keynote, technology that we haven’t really seen before. We saw a little bit of it yesterday from OpenAI and its new GPT-4o, but it feels like broadly we’re moving out of the era of chatbots and into the era of an AI agent. How do you make sure that Google wins that sort of next phase of generative AI that users are going to be increasingly using?
PICHAI: I think you started seeing examples today across our keynote of what we think of as agentic capabilities. Project Astra itself is one, right, to be able to process the real world in front of you and constantly process it and answer it intelligently. We are building, you know, you can go to Gemini and ask it to plan a trip. In search, we announced multistep reasoning. You can write very, very complex queries. Behind the scenes, we are breaking it into multiple parts and composing the answer for you. So these are all agentic directions. Very early days. We’re going to be able to do a lot more. I think that’s what makes this moment one of the most exciting I have seen in my life.
BOSA: And the demo captured a lot of people’s imaginations. Are those products, those features available right now? When are they available?
PICHAI: You know, Project Astra is something we are working to bring to Gemini. But we will launch it sometime this year. It will be quality-driven, just like with Google Lens. We are going to test it out, give it to more people, but then roll it out widely. That’s what we did with search. And so we know how to do it and scale it up.
BOSA: Is that fast enough, when ChatGPT shows a demo or OpenAI shows a demo a day before I/O, and now some of those features are being used right now? Can you guys move faster?
PICHAI: I don’t think they have shipped the demo to their users yet, either. I don’t think it’s available in the product. So, I think all of us are working at the cutting edge of technology and bringing it to our products as fast as possible. I think it’s good to be in that moment. But we have a clear sense of how to approach it, and we will get it right.
BOSA: Right. You said before that Google’s competitive advantage in GenAI is the quality of your data, not just the quantity of it. There was a report that OpenAI trained GPT-4 on millions of hours of YouTube videos. Would you sue OpenAI for violating your terms?
PICHAI: Look, I think it’s a question for them to answer. I don’t have anything to add. We do have clear terms of service. And so I think, normally, in these things, we engage with companies and make sure they understand our terms of service. And we will sort it out.
BOSA: Are you doing anything to determine if they broke your terms?
PICHAI: We have processes to do that. I’m not exactly familiar.
BOSA: OK. Back to the Astra demo, the experience was better through glasses than through the phone. I think that was obvious. Everyone could see. What phone was that, and what kind of glasses were those? And what kind of leap in hardware do we need to really integrate AI agents into our lives?
PICHAI: You know, what we’re showcasing is, we build Gemini to be multimodal, because we see use cases like that. Project Astra shines when you have a form factor like glasses.
BOSA: Yes.
PICHAI: So, we’re working on prototypes. But through Android, we have always had plans to work on A.R. with multiple partners. And so, over time, they will bring products based on it as well.
BOSA: A lot of anticipation over how Apple is going to integrate OpenAI, sorry, generative AI into its phones. What are you doing to make sure that you’re in pole position in generative AI on the iPhone, like you have been in search on the iPhone?
PICHAI: We have had a great partnership with Apple over the years. We have focused on delivering great experiences for the Apple ecosystem. It is something we take very seriously. And I’m confident we have many ways to make sure our products are accessible. We see that today. AI Overviews have been a popular feature on iOS when we have tested it. And so we will continue working to bring that there, including Gemini.
BOSA: We last spoke at I/O about two years ago. Do you expect to be in the same position at I/O in 2025?
PICHAI: Look, I feel we are at an inflection point. Things seem to be happening faster. So, by 2025, I think we will make a lot of progress.
BOSA: A year from now, what do you hope to accomplish?
PICHAI: Things like Project Astra would be something you take for granted when you use Google, and it’ll be able to see the world around you.
BOSA: A wide rollout of Project Astra?
PICHAI: Oh, by this time at I/O? Yes, absolutely.
BOSA: Across the U.S. and even more? Would you be at the same stage as you are in search?
PICHAI: Yes.
BOSA: Rolling it out to over a billion users?
PICHAI: I mean, I, obviously, we will be quality-driven, but that’s the kind of aspiration we are working towards.
BOSA: OK, well, Sundar, thank you so much for taking the time.
PICHAI: Thank you.
BOSA: Appreciate it.