‘In my lifetime, it’s one of the biggest transformations I’ve seen’: A Samsung executive talks Galaxy AI – and why the UI is the AI
When I first met Samsung’s Patrick Chomet a year ago, we were talking about the company’s new foldables, updated watches, and earbuds. We weren’t discussing AI, and the concept of “Galaxy AI” didn’t exist at the time, at least not to me. Now, as we sit in Paris to talk at Samsung Galaxy Unpacked 2024, AI weaves through our discussion like a thin but stretchy thread, stitching together new capabilities on device and in the cloud that will redefine how we use this emerging generation of Samsung Galaxy products.
Chomet, who as VP and Head of Customer Experience chooses both product and third-party features to bring to Galaxy devices, admitted that the pace and scale of change are unprecedented. “In my lifetime [this is] one of the biggest transformations I’ve ever seen. I’ve been through the transformation from feature phone to smartphone. I think this one is actually bigger.”
Samsung, along with partner Google, fully jumped into the generative AI race when it introduced Galaxy AI with the Samsung Galaxy S24 line in January. Now, it’s bringing many of those features, including conversational translation, photo editing and image generation, to its foldable line (Galaxy Z Fold 6, Galaxy Z Flip 6), its new earbuds (Galaxy Buds 3 and Galaxy Buds 3 Pro) and the all-new Samsung Galaxy Ring (which Chomet says was developed in part because people, himself included, don’t like to wear their smartwatches while they sleep).
In the interim between these two launches, Apple finally entered the fray with Apple Intelligence, its rebranded suite of ecosystem-level generative AI features. I wondered how Samsung viewed Apple Intelligence. Chomet chuckled quietly and told me that his PR team typically warns him against talking about competitors, but without naming the company, he offered a pretty damning analysis of Apple’s big move.
“We have started [the] intelligent S24, and… I don’t like the name, but we have to have a name, so we’re calling it Galaxy AI, which will represent all these intelligence features at the various touchpoints. And then [a] competitor has a different name. We say we are hybrid, and I think they end up with [something] similar. First of all, ours is real. We’ve deployed this stuff on over 100 million devices and will be on over 200 million by the end of the year. So you can see what it is and you can try it. I honestly don’t know of anything else in the market that’s real.”
Chomet was clearly referring to the fact that Apple Intelligence is not only tied to platforms launching later this year, like iOS 18, but also that Apple has indicated some of the biggest changes, like on-device image generation, may not arrive until later this year or even next year.
Closer to home, Chomet isn’t just focusing on the Galaxy AI features of individual products; he sees this as an ecosystem.
“The transformation to AI is about making all services intelligent, but also about better understanding the context and intent of the customer, so that’s very deep. We want to implement that at scale across all devices, including smartphones and others,” Chomet told me, later adding that “we want to do it at an ecosystem level.”
That ecosystem play is muddied somewhat by the existence of Bixby, one of the early generation of digital assistants, like Siri and Alexa, that are all reportedly due for significant upgrades. But Galaxy AI is not Bixby, and vice versa. I asked Chomet where Bixby fits into this picture.
The Bixby Question
Chomet told me it’s important to remember that all of these early assistants, including Bixby, were based on powerful natural language processing (NLP). “What’s happening now with the big language [models], with the technology, is making NLP pretty much obsolete, and LLM [large language model] technology is much better at understanding human intentions, language, gestures, voice. So that’s a big technological change.”
For Samsung, Bixby is a service primarily concerned with device control, and while it may not be as well-known as Siri or Alexa, if you enable it on your Galaxy phone you might be surprised at all the things it can do for you. Still, I couldn’t help but wonder whether there’s still a place for Bixby alongside the more visible and critically acclaimed Galaxy AI.
“It’s a service that many of our customers use, but not everyone. And we will actually continue to develop and improve Bixby with this LLM technology and others, to make the experience magical across all Samsung devices,” Chomet revealed.
In other words, Bixby will get a lot smarter, though no timeline has been set for that. Still, Bixby will remain fundamentally different, if not separate, from Galaxy AI.
“I try [to make a distinction] between the structure of the device, which is intelligent, and then we’re going to run all kinds of services on the device, from Bixby to music and other things, all of which will also be intelligent,” Chomet explained.
Building the AI Ecosystem
That intelligent fabric will flow across multiple Galaxy devices, eventually recognizing context and intent, and it’s a strategy that could be amplified by the Galaxy ecosystem. Samsung has been pushing the idea of this ecosystem for years, but because the company doesn’t have end-to-end control the way Apple does (from the devices to the platforms to the silicon), it hasn’t always been visible in the same way.
“We see a healthy development of the Galaxy ecosystem. We still have a lot to do, that’s fair,” Chomet admitted, but he also explained how owning more Galaxy devices, especially with the addition of Galaxy AI, will bring new benefits. “The key word is convenience,” he said.
Chomet pointed out how pairing the new Galaxy Buds 3 Pro with a Galaxy Z Fold 6 works with a single click, and how the phone can hand off calls and read out emails or notifications to the buds. The combination of devices makes them work as one, he told me: “By the way, this is not science fiction, this is real.”
During our conversation, Chomet described a fundamental change in the way Samsung looks at platforms.
“What we say is that the UI is AI. AI is the UI. The user interface is becoming intelligent… every service, ours or third-party, is powered by AI and includes generative AI. So that will continue. That’s kind of a revolution, but not the deepest, from my perspective. The deepest revolution for us is the UX; the user interface is intelligent, starting with gestures, text, speech, audio, and so on. So that’s the user interface. The AI challenge is one that we’re very, very deeply involved in, and of course that also plays out at the OS level with our partnerships.”
I’ve already started playing around with some of the Galaxy AI features across multiple devices, and one thing I noticed is that they’re not all easy to find. Some, like conversational translation, are buried under Settings. Chomet didn’t disagree.
“So, my kind of constant struggle is we have so many things and they’re like, how would the customer know? Oh, you have to go into the Settings menu… but no one goes there, certainly no one in my family ever uses that menu. So the beauty of it would be that you don’t. The goal is that you never have to go into the Settings menu,” he says, before adding cryptically, “which we’ll never achieve, but that’s our goal.”
Samsung has made improvements on that front. For example, I noticed that one of the Galaxy AI features first introduced with the S24 line, the ability to add slow-motion to videos recorded at normal speed, has been improved on the Galaxy Z Fold 6, with Samsung making it easier for users to save those clips. Chomet told me that another change on that front is suggestions for generative AI photo editing. “Now we’ll [offer] suggestions instead of you having to search for them.”
The Special Partnership
Where Samsung seems to lack consistency is in the way it implements these various Galaxy AI features. Perhaps that’s because it relies so heavily on partnerships, particularly with Google, with which Chomet says his company has a “very special” partnership. I was curious to hear Chomet’s take on Circle to Search, a feature that Google seems to be claiming as its own. Chomet was less certain. “I don’t know who invented it,” he told me. “Search is Google, but the physical interface is a collaborative effort.”
The origin story is a little fuzzier than I expected. Chomet told me that Samsung had talked to Google a few years ago, and Google was talking about new gestures on the screen. “You have to integrate hardware and software to make it work, right?” he said. “So I don’t know who invented that, but we actually worked with them, so ultimately it’s powered by Google Search.”
As Chomet sees it, Samsung’s needs are sometimes the mother of Google’s invention. He pointed to the Samsung Galaxy Z Fold 6 that sat before us. “We launched the foldable first. So we have something here called Screen Continuity, which is an Android feature. So if you have a YouTube video on your home screen and you open the foldable, it automatically and seamlessly continues. So that feature, you’ll say, ‘Well, that’s definitely an Android feature,’ but this feature wouldn’t exist if I didn’t ask for it, because I need it for the foldable.”
Like Apple’s, Samsung’s approach to AI is hybrid, split between on-device and cloud processing. However, services like Circle to Search, and some generative features like Sketch to Image, still run off-device.
Chomet told me it’s better to run things on the device itself for low latency and privacy, but there’s more to it than that.
“You don’t need connectivity, so it’s just faster,” he explained. “So performance, if we do it on-device, we have more context of the user data on the device, which is private. So we can keep all the user context or the user data on the device and personalize the experience, like the next action, configuration, what should it be, based on your context that’s private and secure on the device. So the more we keep it on-device, the more we can do things that are personalized but still private. So performance, personalization, and privacy […] that is our direction.”
In the meantime, Chomet told me, users can decide for themselves whether they want these generative AI systems to share their data with the cloud.
As the transition from hybrid to, perhaps one day, everything-on-device continues, Samsung is working to make as much of Galaxy AI available to its billion-strong user base as possible.
“The only limitation is quality,” he said. “We’re going as fast as we can with the right level of quality to deliver to the end user, and it’s really just getting started.”