Siri might be getting a makeover—but not in the way most expected. As Apple heads into WWDC25, all eyes are on how it plans to rescue its struggling voice assistant. But instead of just improving how Siri talks, Apple may be shifting its focus to how Siri sees. Backed by the Vision Pro team and new camera-based technologies, Apple appears to be steering its AI strategy toward visual intelligence, turning everyday devices into context-aware assistants. Could this be the breakthrough Siri needs?
WWDC25: What to Expect
Apple’s annual developer conference, WWDC25, kicks off on June 9th, and all eyes are on Siri. The company is under intense pressure to:
- Fix Siri’s outdated performance
- Deliver delayed AI upgrades promised for the iPhone 16
- Show working demos of smarter assistant behavior
But here’s the twist: Apple isn’t just trying to make Siri more conversational; it’s trying to make Siri more visually aware.
Siri Delays & Apple’s AI Lawsuit
Apple had promised major Siri improvements:
- Personalized help based on app activity
- Smart context switching between apps
- Integration across Messages, Calendar, and more
These features were promoted in iPhone 16 marketing but failed to launch, leading to a class-action lawsuit for false advertising filed in San Jose, CA.
“A smarter Siri was a big part of the sales pitch for the iPhone 16.”
Leadership Shakeup: Vision Pro Team Takes Over Siri
In response to delays and public criticism, Apple handed Siri’s future to Mike Rockwell, the VP who led the Vision Pro launch.
| Before | After |
|---|---|
| Siri team reported to the AI & iOS division | Siri team reports to the Vision Pro team (led by Rockwell) |
Why Rockwell? Because the Vision Pro isn’t just a headset; it’s Apple’s most camera-heavy AI product to date, with 12 cameras analyzing the user’s environment in real time.
“The Vision Pro isn’t just AR—it’s an AI system with eyes.”
Apple’s Future AI Strategy: Cameras Everywhere
Apple’s new direction in AI seems clear: use cameras to give Siri context and awareness.
Here’s what Apple may be working on:
1. Apple Watch with Camera
- Goal: Make it a wearable AI assistant
- Feature: Camera input to understand surroundings
- Reported by: Mark Gurman (Bloomberg)
2. AirPods with Infrared Sensors
- Idea: Not a photo camera, but a sensor for gestures, spatial awareness, and sound positioning
- Could support: Gesture detection, adaptive audio, hands-free commands
3. Apple Smart Doorbell
- Uses: Facial recognition to unlock doors
- Timeline: Rumored launch as early as late 2025
4. Apple Maps + Camera AI Training
- Vehicles now use captured imagery to train Apple’s generative AI tools
- Supports features like: Image Playground, object clean-up, scene understanding
Key Takeaways
| Topic | Summary |
|---|---|
| WWDC25 Focus | iOS 19 and real-world demos of Siri’s next-gen AI |
| Major Delay | Siri upgrades missed the iPhone 16 release window |
| Legal Trouble | Apple faces a lawsuit for false advertising over AI claims |
| New Direction | Vision Pro’s AI team now leads Siri development |
| Core Strategy | Cameras will power the next leap in Siri’s intelligence |
Final Thoughts
Apple’s AI game plan is expanding beyond voice—and into vision. Siri may not just hear you; it may see and interpret the world around you. But here’s the big question:
Will users embrace more camera-powered devices…
…if Siri still struggles with basic tasks?
Let us know what you’re hoping to see at WWDC25. Do you have high expectations for iOS 19 and Apple Intelligence? Drop your thoughts below—and join us next week for one more thing in the world of Apple.