If you’ve read my previous thoughts on iPhones here on FactsTimes and sister site Tom’s Guide, you’ll know that I have some pretty strong opinions about Apple’s smartphones.
Since switching from Android to iPhone in late 2021, I haven’t gone back to the platform Google built, despite trying some of the best Android phones. I was drawn to the ease of iOS; I love the titanium construction, I find Ceramic Shield glass to be a minor game-changer, I like the action button, and the cameras rarely let me down on iPhones.
But for once, I’m having second thoughts.
The feature giving me those second thoughts is the Camera Control button. In some ways, it’s a cool new addition that makes good use of haptics. In other ways, it’s redundant and underpowered.
I’ve been testing the iPhone 16 Pro Max for a few weeks now, and when it comes to taking a photo, I try to use Camera Control as much as possible. Being 37 and a millennial, I still enjoy taking photos in landscape mode on my phone, so having a physical button where my finger naturally sits is nice for snapping a photo without messing up the framing by tapping the screen or trying to hit the action button—I have it set to activate the “flashlight” anyway, which is surprisingly useful.
I also like being able to cycle through zoom ranges with a swipe on the Camera Control rather than tapping on little icons. The exposure control is pretty cool too, though switching between the functions Camera Control handles doesn’t feel entirely intuitive yet, and I often lose track of my framing of a scene amid all the taps.
So yeah, Camera Control is interesting. But…
Did anyone really ask for it? It feels like a feature designed to give Apple’s executives something new to talk about at the company’s September event. It’s a “nice to have,” but it’s hardly a game changer for phone photography.
Maybe I’ll get used to it over time. But the biggest issue is the lack of AI tools for Camera Control at launch. Apple is actively promoting AI features for Camera Control that can intelligently identify what the cameras are pointed at and serve up all sorts of information. That hasn’t happened yet, as the rollout will come after launch when Apple Intelligence is fully available; there’s a beta version, but I don’t want to run it on the phone I rely on every day.
I still can’t get my head around that. Sure, other phone makers have touted AI features that arrive after their phones’ release, sometimes limited to certain regions at first, but they at least launch with some of the promised AI suite. The iPhone 16 series launched without any Apple Intelligence features at all.
This isn’t what I expect from Apple, a company known for not embracing new technology until it’s refined and ready for primetime. So it’s baffling to see its smartphones launch without their headline generation of smarts. It’s also the main reason I’m divided on Camera Control; if it had shipped with Google Lens-like capabilities in a hardware form factor, I’d be much more positive about it.
Of course, Apple’s inclusion of such a camera button will undoubtedly lead to other phone makers following suit. I just hope they don’t skimp on features when their phones launch.
As for Camera Control in the here and now, I’ll keep an open mind and keep using it; I’m keeping my fingers crossed that it becomes seriously useful once it gets its promised dose of AI smarts.