It can't be denied that Apple's Siri digital assistant didn't hold a place of honor at this year's WWDC 2025 keynote. Apple acknowledged, repeating what it had already said, that it's taking longer than expected to deliver the smarter Siri it promised a year ago, and said that the full Apple Intelligence integration would arrive "in the coming year."
Apple has since confirmed that this means 2026. That means we won't see, in 2025, the kind of deep integration in which Siri uses what it knows about you and your iOS-running iPhone to become a better digital companion, drawing on App Intents in iOS 26 to understand what's happening on screen and take action on your behalf.
I have my theories about the reason for the delay, most of which revolve around the tension between delivering a rich AI experience and Apple's core principles regarding privacy. The two often seem to work at cross purposes. But that's guesswork. Only Apple can tell us exactly what is going on – and now it has.
Together with Tom's Guide Global Editor-in-Chief Mark Spoonauer, I sat down shortly after the keynote with Apple Senior Vice President of Software Engineering Craig Federighi and Apple Global VP of Marketing Greg Joswiak for a wide-ranging podcast discussion about almost everything Apple revealed during the 90-minute keynote.
We started by asking Federighi what Apple has delivered so far with Apple Intelligence, as well as the status of Siri and what iPhone users can expect this year and next. Federighi was surprisingly transparent, offering a window into Apple's strategic thinking when it comes to Apple Intelligence, Siri, and AI.
Far from nothing
Federighi started by walking us through everything Apple has delivered with Apple Intelligence so far, and to be fair, it is a considerable amount.
"We were very focused on creating a broad platform for truly integrated personal experiences in the operating system," recalled Federighi, referring to the original Apple Intelligence announcement at WWDC 2024.
At the time, Apple demonstrated Writing Tools, summaries, notification summaries, Memory movies, semantic search of the photo library, and Clean Up in Photos. It delivered all of those features, but even as Apple was building those tools, Federighi told us, "we were counting on that foundation of large language models on device, Private Cloud Compute as the basis for even more intelligence, [and] semantic indexing on device to gather personal knowledge, to build a better Siri."
Overconfident?
A year ago, Apple's confidence in its ability to build such a Siri led to promises of an assistant that could handle more conversational context and misspoken requests, plus Type to Siri and a considerably redesigned UI. Again, Apple has delivered all of that.
"We also talked about [...] things like being able to invoke a broader range of actions on your device through App Intents orchestrated by Siri to do more things," Federighi added. "We also talked about the ability to use personal knowledge from that semantic index, so if you asked for things like, 'What's that podcast that "Joz" sent me?' we could find it, whether it was in your messages or in your email, and surface it, and then maybe even take action using those App Intents."
This is by now a familiar story. Apple overpromised and underdelivered, failing to ship the vaguely promised end-of-year Apple Intelligence Siri update in 2024 and conceding by spring 2025 that it would not be ready anytime soon. Why that happened has been something of a mystery until now. Apple is not in the habit of demonstrating technology or products it is not sure it can deliver on schedule.
However, Federighi explained in some detail where things went wrong and how Apple is moving forward from here.
"We found that when we were developing this feature, we really had two phases, two versions of the ultimate architecture that we were going to create," he explained. "Version one we had here as we were approaching the conference, and at the time we had high confidence we could deliver it. We thought by December, and if not, we figured by spring, and so we announced it as part of WWDC. Because we knew the world wanted a very complete picture of: 'What does Apple think about the implications of Apple Intelligence and where is it going?'"
A tale of two architectures
While Apple was working on a V1 of the Siri architecture, it was also working on what Federighi called V2, "a deeper end-to-end architecture that we knew was ultimately what we wanted to create to deliver the full set of capabilities that we wanted for Siri."
What everyone saw during WWDC 2024 were videos of that V1 architecture, and it was the basis for the work that began in earnest after the WWDC 2024 reveal, in preparation for the full Apple Intelligence Siri launch.
"We set about for months making it work better and better across more app intents, better and better at search," Federighi added. "But fundamentally, we found that the limitations of the V1 architecture weren't getting us to the quality level we knew our customers needed and expected. We realized that with the V1 architecture you could push and push and put in more time, but if we tried to push it out in the state it was going to be in, it would not meet our customer expectations or Apple's standards, and we had to move to the V2 architecture.
"As soon as we realized that, and that was in the spring, we let the world know that we weren't going to be able to release it yet, and that we would keep working on really shifting to the new architecture and releasing something."
We realized that [...] if we tried to push that out in the state it was going to be in, it would not meet our customer expectations or Apple's standards, and that we had to move to the V2 architecture.
Craig Federighi, Apple
That switch, and what Apple learned along the way, means Apple won't make the same mistake again by promising a new Siri for a date it can't guarantee. Instead, Apple won't "pre-announce a date," explained Federighi, "until we have in-house the V2 architecture delivering not just in a form that we can demonstrate for you all…"
He then joked that, although he could in fact "demo" a working V2 model, he wasn't going to do so. Then he added, more seriously: "We have, you know, the V2 architecture working, of course, but we're not yet at the point where it's delivering at the quality level that I think makes it a great Apple feature, and so we're not announcing the date for when that happens."
I asked Federighi if he was talking about a wholesale rebuild of Siri, but Federighi disabused me of that notion.
"I should say that the V2 architecture is not that, it wasn't a start-over. The V1 architecture was sort of half of the V2 architecture, and now we're extending it into a kind of homogeneous, pure architecture that stretches across the whole Siri experience. So we're very much building on what we've built, and we're so much the better for it."
Another AI strategy
Some may see Apple's failure to deliver the full Siri overhaul on its original schedule as a strategic stumble. But Apple's approach to AI and product is also completely different from that of OpenAI or Google Gemini. It's not about a single product or a powerful chatbot, and Siri isn't necessarily the center of it all, as we had all assumed.
Federighi doesn't dispute that "AI is this transformational technology [...] Everything that grows out of this architecture will have decades-long implications across the industry and the economy, just like the internet, just like mobility, and it will touch Apple's products and it will touch experiences that are far beyond Apple products."
Apple clearly wants to be part of this revolution, but on its own terms and in a way that most benefits its users while, of course, protecting their privacy. Siri, however, was never the endgame, as Federighi explained.
AI is this transformational technology [...] and it will touch Apple's products and it will touch experiences that are far beyond Apple products.
Craig Federighi, Apple
"When we started with Apple Intelligence, we were very clear: this was not about building a chatbot. So apparently, when some of these Siri capabilities I mentioned didn't arrive, people were like, 'What happened, Apple? I thought you were going to give us your chatbot.' That was never the goal, and it still isn't our primary goal."
So what is the goal? I think the WWDC 2025 keynote made it pretty clear. Apple plans to integrate Apple Intelligence across all its platforms. Instead of you going to a single app such as ChatGPT for your AI needs, Apple is, in a sense, putting it everywhere. It's done, Federighi explained, "in a way that meets you where you are, not that you go off to a chatbot experience to get things done."
Apple understands the allure of conversational AI. "I know a lot of people find it a really powerful way to collect their thoughts, brainstorm [...] So, sure, those are great things," said Federighi. "Are they the most important thing for Apple to develop? Well, time will tell where we go there, but that's not the most important thing we wanted to do right now."
See below for the full interview with Federighi and Joswiak.
