Cameras have always been a major focus for the Pro iPhones, and that remains true this year. The iPhone 17 Pro and Pro Max retain a triple-camera system: a 48-megapixel main camera, a 48-megapixel ultrawide camera, and a 48-megapixel telephoto camera.
That doesn’t mean there aren’t new camera tricks, though. In fact, there are a few. For a few years now, Apple has used a sensor-crop technique to deliver so-called “optical-quality” or “lossless” zoom on the main camera: the phone crops into the center of the 48-megapixel sensor to produce a 2x image without any upscaling. Now, the company is bringing that technique to another sensor: the telephoto camera. The telephoto camera itself is new. It’s a tetraprism design with 4x optical zoom, and when you apply a 2x sensor crop to 4x glass, you get 8x “optical quality” zoom. Add the also-optical-quality 28mm (1.2x) and 35mm (1.5x) focal lengths, plus the macro mode on the ultrawide camera, and you can now shoot at macro, 0.5x, 1x, 1.2x, 1.5x, 2x, 4x, and 8x to achieve what Apple calls “eight lenses” in one phone.
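The arithmetic behind that “8x optical quality” claim is simple: a 2x linear crop doubles the effective zoom while quartering the pixel count, since the crop shrinks both sensor dimensions. Here’s a minimal sketch of that math (the function name is mine, and the figures are round numbers for illustration, not Apple’s published specs):

```python
def sensor_crop_zoom(optical_zoom: float, sensor_mp: float, crop: float):
    """Return (effective zoom, output megapixels) for a center sensor crop.

    A linear crop factor of `crop` multiplies the effective zoom by `crop`
    but divides the pixel count by crop**2, because the crop reduces both
    the width and the height of the captured area.
    """
    return optical_zoom * crop, sensor_mp / crop ** 2

# Main camera: 1x optical, 48 MP sensor, 2x center crop -> 2x at 12 MP
print(sensor_crop_zoom(1.0, 48, 2.0))  # (2.0, 12.0)

# New telephoto: 4x optical, 48 MP sensor, 2x center crop -> 8x at 12 MP
print(sensor_crop_zoom(4.0, 48, 2.0))  # (8.0, 12.0)
```

The trade-off this sketch makes plain: sensor-crop zoom avoids upscaling, but each crop step costs resolution, which is why a high-resolution 48-megapixel sensor is what makes the technique viable.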
Now, the term “lenses” is perhaps not fully appropriate. Lenses imply actual glass, and the iPhone 17 Pro does not have eight cameras. But the idea of increasing flexibility without adding physical cameras is a neat one, and one that I appreciate. If adding extra processing pipelines lets Apple apply sensor crops at other zoom levels, including ones in between the cameras, I’m perfectly happy for it to claim the iPhone has the equivalent of more “lenses.” That’s especially true if it can do so without adding more actual cameras; arguably, the triple-camera iPhone 17 Pro is now more versatile than the quad-camera Samsung Galaxy S25 Ultra.
Regardless of Apple’s marketing claims, the phone performed incredibly well in all situations, and the versatility of the camera system really made a difference here. In scenarios that were easy for the phone to handle, like well-lit environments and low levels of zoom, it captured bright, vibrant, and detailed images. But it’s how the phone handled more challenging scenarios that set it apart from the base iPhone 17.
Still, even with the new telephoto camera, the Galaxy S25 Ultra seemed to capture slightly more detail in extremely zoomed-in photos. Colors looked a little more natural on the iPhone, but edges started to get blurry, while the Galaxy device was able to retain some detail. The iPhone did outperform the Pixel 10 Pro, however, even with the Pixel’s AI-powered zoom processing.
Combining the challenges of low light and zoom, the iPhone performed better than either the Galaxy or the Pixel device. Even at higher levels of zoom, like 20x, the iPhone 17 Pro captured more detail than the other devices while retaining relatively bright colors.
There are improvements on the front of the phone as well. All iPhone 17 models get a new 18-megapixel square sensor for the front-facing camera, combined with Apple’s Center Stage tech to make it highly flexible. Because the sensor is square, rotating the phone doesn’t change the orientation of your shot; instead, you tap a button to switch between vertical and horizontal framing, so you can hold the phone however is most comfortable. The device will also automatically use Center Stage to apply a tighter crop to selfies with just one or two people versus those with more, though you can control this manually if, for example, you want more of your background in the shot.
Apple has also added support for ProRes RAW on the iPhone 17 Pro, which captures more sensor information for professional video workflows. That’s not something I use or was able to meaningfully test, but it may enhance your particular workflow.