OnePlus 15 vs Galaxy S25 Ultra: What a 500-Shot Shootout Reveals About Smartphone Camera Engineering

Based on reporting and testing by Adam Doud for ZDNET; analysis and perspective expanded here for a technical audience.

Smartphone camera comparisons are usually content marketing in disguise: a few staged portraits, a skyline at dusk, some 10x zoom crops, and a verdict that politely flatters both vendors.

A recent ZDNET field test at Six Flags cuts through some of that choreography. Over 500 photos shot across harsh daylight, colored stage lighting, smoke-filled scare zones, and AI-driven super-zoom expose something more interesting than “which camera is better.” They expose how Samsung and OnePlus are now engineering fundamentally different imaging stacks, and what that means for computational photography in 2025.

This isn’t a beauty pageant. It’s a case study in trade-offs: sensor binning vs consistency, denoise vs detail, multi-telephoto arrays vs algorithmic reach — and the increasingly thin line between photography and hallucination.

Hardware: Two Blueprints, Two Philosophies

On paper, Samsung’s Galaxy S25 Ultra looks like the obvious powerhouse:

  • Main: 200MP, f/1.7, 1/1.3"
  • Ultrawide: 50MP, 120° FoV
  • Telephoto 1: 10MP, 3x optical
  • Telephoto 2: 50MP, 5x optical
  • Front: 12MP

Samsung is all-in on optionality: multiple dedicated tele lenses, a very high-resolution primary, and a tuning history that favors detail and contrast for social-ready output.

By contrast, the OnePlus 15 goes for structural coherence:

  • Rear triple: 50MP main (f/1.8, 1/1.56"), 50MP ultrawide (116°), 50MP 3.5x periscope telephoto
  • Front: 32MP

Three 50MP sensors create a more uniform data surface for OnePlus’ pipeline: similar pixel pitches and behavior allow more predictable denoise, sharpening, and color across focal lengths. The Hasselblad branding is gone, but OnePlus keeps its XPAN-style mode and, more importantly, has matured its own color and processing identity.
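
To make that concrete, here is a toy sketch of per-module tuning tables, with hypothetical knob names and values (nothing here reflects OnePlus’ or Samsung’s actual parameters). The point is structural: matched sensors let one baseline profile carry across modules with small offsets, rather than several divergent hand-tuned tables.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ModuleTuning:
    """Hypothetical ISP tuning knobs for one camera module (illustrative only)."""
    denoise_strength: float   # spatial noise-reduction aggressiveness
    sharpen_amount: float     # output sharpening gain
    wb_gain_r: float          # red white-balance gain
    wb_gain_b: float          # blue white-balance gain

# With three matched 50MP sensors, one baseline can be shared and each
# module only needs small, predictable offsets.
baseline = ModuleTuning(denoise_strength=0.30, sharpen_amount=0.50,
                        wb_gain_r=1.00, wb_gain_b=1.00)

uniform_stack = {
    "main_50mp":      baseline,
    "ultrawide_50mp": replace(baseline, denoise_strength=0.35),  # smaller pixels
    "tele_50mp":      replace(baseline, sharpen_amount=0.55),    # 3.5x periscope
}

# A stack mixing 10MP, 50MP, and 200MP sensors cannot share a baseline this
# cleanly: each module tends to need its own table, and keeping color and
# texture consistent across lenses becomes its own tuning project.
```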

From an engineering standpoint, this is the core divergence:

  • Samsung: Maximal sensor and lens diversity, heavier fusion logic, and aggressive scene optimization.
  • OnePlus: Cohesive sensor strategy, simpler mapping across modules, and tighter control over how AI and ISP stages behave from 0.6x to 120x.

The Six Flags shoot shows how those decisions surface in real-world edge cases.

Daylight Main Camera: Data Density vs Pleasant Rendering

In bright scenes, the S25 Ultra’s 200MP sensor behaves exactly as architects intend: oversampled detail, finer texture rendition, and more faithful surfaces.

In the ZDNET comparison, brick, grime, and wall textures are more accurately resolved on Samsung; OnePlus smooths, aiming for a cleaner, more flattering image.

For practitioners building imaging pipelines, this is familiar:

  • Samsung leans into detail-preserving, micro-contrast-heavy tuning that survives cropping and pixel peeping.
  • OnePlus prioritizes consumer-friendly aesthetics, allowing its NR and tone mapping to erase some micro-texture.

Neither is wrong. But if your workflow includes reframing, computational zoom, or downstream CV tasks (OCR on signage, object recognition, AR overlays), Samsung’s detail advantage is meaningful signal.
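
If you want to quantify that detail gap rather than eyeball it, a variance-of-Laplacian score on matched crops is a common first-pass sharpness proxy. This is a generic OpenCV sketch with placeholder filenames, not part of either vendor's pipeline; a high score can mean real texture or sharpening halos, so treat it as a screening metric rather than a verdict.

```python
import cv2

def sharpness_score(path: str) -> float:
    """Variance of the Laplacian: a rough proxy for micro-detail/acutance."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

# Placeholder filenames for matched 100% crops of the same brick wall.
for label, path in [("S25 Ultra", "s25u_brick_crop.png"),
                    ("OnePlus 15", "op15_brick_crop.png")]:
    print(label, round(sharpness_score(path), 1))
```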

Night Main Camera: Sharpening, Smoke, and Readability

Low light on a Halloween-themed midway introduces non-determinism: haze, smoke, shifting light.

In one standout shot, OnePlus takes the win. While Samsung produces a slightly blurry capture under the same conditions, OnePlus leans harder into sharpening and contrast, delivering:

  • A more readable warning sign
  • A more presentable image at typical phone-screen viewing sizes

Is it natural? Less so. Is it effective? In context, yes.

For developers, this illustrates how vendor tuning encodes assumptions:

  • OnePlus optimizes for perceived clarity at mobile resolutions, even if it means oversharpening.
  • Samsung occasionally underplays micro-contrast in motion-heavy or haze-heavy scenes.

If you’re designing camera experiences for users who primarily share in apps, OnePlus’ bias toward clarity-over-purity is instructive.
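
The clarity-over-purity trade-off is easy to reproduce. A minimal unsharp-mask-plus-contrast sketch (generic OpenCV, not either vendor's actual tuning; the filename is a placeholder) shows how pushing amount and contrast makes hazy signage more legible at phone size while introducing the halos and crunch that show up at 100%.

```python
import cv2
import numpy as np

def clarity_boost(img: np.ndarray, amount: float = 1.2, radius: int = 9,
                  contrast: float = 1.15) -> np.ndarray:
    """Unsharp mask plus a mild contrast stretch around mid-gray.

    Higher `amount`/`contrast` reads as "clearer" on a phone screen but
    exaggerates halos and noise when pixel-peeped.
    """
    blur = cv2.GaussianBlur(img, (radius, radius), 0)
    sharpened = cv2.addWeighted(img, 1.0 + amount, blur, -amount, 0)
    return cv2.convertScaleAbs(sharpened, alpha=contrast, beta=128 * (1 - contrast))

night = cv2.imread("hazy_warning_sign.jpg")   # placeholder path
if night is not None:
    cv2.imwrite("hazy_sign_clarity.jpg", clarity_boost(night))
```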

Ultrawide: The Secondary Sensor Stress Test

Auxiliary cameras are where many flagships quietly cut corners.

  • Daytime ultrawide: OnePlus outperforms with crisper foliage and structures; Samsung’s sample appears softer and lower-res at the edges.
  • Night ultrawide: Samsung edges ahead on defined detail in stairs and railings, while OnePlus smooths shadows more pleasantly but less accurately.

Technically, this tracks with a broader industry pattern: OEMs still treat ultrawide as a second-class citizen in tuning. The lesson: teams that maintain cohesive ISP behavior across lenses (as OnePlus attempts with its triple-50MP setup) can gain real-world advantages, especially for users relying on ultrawide for travel, architecture, and documentation.
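
One way to put a number on "softer at the edges" is a corner-to-center sharpness ratio using the same Laplacian-variance proxy as above; values well below 1.0 indicate edge falloff. This is a generic sketch with a placeholder filename, and it is only meaningful on roughly uniform-texture scenes (foliage, brick) shot identically on both phones.

```python
import cv2
import numpy as np

def corner_to_center_sharpness(path: str, frac: float = 0.2) -> float:
    """Mean corner sharpness divided by center sharpness (1.0 = uniform)."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)
    h, w = gray.shape
    ch, cw = int(h * frac), int(w * frac)

    def lap_var(tile: np.ndarray) -> float:
        return float(cv2.Laplacian(tile, cv2.CV_64F).var())

    center = gray[(h - ch) // 2:(h + ch) // 2, (w - cw) // 2:(w + cw) // 2]
    corners = [gray[:ch, :cw], gray[:ch, -cw:], gray[-ch:, :cw], gray[-ch:, -cw:]]
    return float(np.mean([lap_var(c) for c in corners]) / max(lap_var(center), 1e-6))

print(corner_to_center_sharpness("ultrawide_daylight.jpg"))  # placeholder file
```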

10x Telephoto: When Non-Native Zoom Wins

At 10x — not native for the OnePlus 15’s 3.5x periscope — the expectation is a Samsung win, thanks to its multi-tele stack and long-standing AI zoom branding.

Instead, OnePlus produces the sharper, more dimensional image, while Samsung’s output looks overprocessed and flat.

This is the most interesting finding for imaging engineers:

  • OnePlus’ upscaling, fusion, and sharpening chain is strong enough to beat Samsung’s theoretically superior optics at this step.
  • Samsung’s telephoto pipeline, tuned for drama at higher zoom factors, can drift into heavy-handed artifact territory.

In other words, computational photography is now good enough that smart, conservative algorithms on a single strong tele module can eclipse brute-force multi-lens arrays when tuning is off.
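
As a baseline for what "conservative" means here, a minimal digital-zoom chain is just a center crop, a Lanczos upscale, and a restrained unsharp mask. This is a generic sketch to make the idea concrete, with a placeholder filename; OnePlus' actual pipeline also stacks frames and applies learned upscaling.

```python
import cv2

def conservative_zoom(img, zoom: float = 10 / 3.5, sharpen: float = 0.5):
    """Center-crop by `zoom` (3.5x optical -> 10x total needs ~2.86x digital),
    Lanczos-upscale back to full resolution, then lightly sharpen."""
    h, w = img.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    up = cv2.resize(crop, (w, h), interpolation=cv2.INTER_LANCZOS4)
    blur = cv2.GaussianBlur(up, (0, 0), 1.5)
    return cv2.addWeighted(up, 1.0 + sharpen, blur, -sharpen, 0)

tele = cv2.imread("tele_3_5x_frame.jpg")      # placeholder: native 3.5x capture
if tele is not None:
    cv2.imwrite("tele_10x_digital.jpg", conservative_zoom(tele))
```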

100x–120x AI Zoom: Where Photography Becomes Interpretation

Extreme zoom is where physics politely leaves the chat and AI steps in.

  • Samsung: up to 100x
  • OnePlus: up to 120x

At ~500+ feet, capturing a Six Flags Fright Fest flag and a Foghorn Leghorn statue, both phones rely on aggressive:

  • Multi-frame stacking (a minimal align-and-average sketch follows this list)
  • Upscaling and deblurring
  • Edge-aware sharpening
  • Prior-based reconstruction (learned patterns for text, edges, shapes)
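
As a reference point for the stacking stage, here is a minimal align-and-average sketch using OpenCV feature matching; real super-zoom pipelines replace the plain average with robust merging and learned reconstruction, and nothing here reflects either vendor's proprietary stack. Filenames are placeholders.

```python
import cv2
import numpy as np

def stack_frames(paths: list[str]) -> np.ndarray:
    """Align a burst to its first frame (ORB + similarity transform), then average.

    Averaging trades a little sharpness for a large noise reduction before any
    upscaling/sharpening stage runs. Border handling is ignored for brevity.
    """
    ref = cv2.imread(paths[0])
    orb = cv2.ORB_create(2000)
    ref_kp, ref_des = orb.detectAndCompute(cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY), None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    acc, used = ref.astype(np.float64), 1
    for path in paths[1:]:
        frame = cv2.imread(path)
        kp, des = orb.detectAndCompute(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), None)
        if des is None:
            continue
        matches = sorted(matcher.match(des, ref_des), key=lambda m: m.distance)[:200]
        if len(matches) < 10:
            continue
        src = np.float32([kp[m.queryIdx].pt for m in matches])
        dst = np.float32([ref_kp[m.trainIdx].pt for m in matches])
        M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
        if M is None:
            continue
        acc += cv2.warpAffine(frame, M, (ref.shape[1], ref.shape[0])).astype(np.float64)
        used += 1
    return np.clip(acc / used, 0, 255).astype(np.uint8)

# burst = [f"burst_{i:02d}.jpg" for i in range(8)]   # placeholder hand-held burst
# cv2.imwrite("stacked.png", stack_frames(burst))
```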

In ZDNET’s samples, OnePlus delivers a startlingly clean frame:

  • Readable typography, with subtle letterform details intact
  • Clear subject separation on the statue

Samsung’s result, by comparison, degrades into a noisier, more garbled reconstruction.

Through a technical lens, this is a statement win for OnePlus’ AI zoom stack. But it should also make every engineer slightly uncomfortable.

At 100x–120x, we are no longer strictly capturing reality. We are synthesizing a plausible version of it.

For anyone working on:

  • Forensic imaging
  • Safety/security camera integrations
  • On-device visual search or OCR

…the OnePlus/Samsung behavior is a live reminder: super-zoom outputs should never be treated as ground truth. They’re the output of priors, training data, and tone curves — not just photons.
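
One lightweight guardrail, where the capture app writes standard EXIF, is to check DigitalZoomRatio and flag heavily zoomed frames before they enter a trust-sensitive workflow. This is a generic Pillow sketch with a hypothetical threshold and placeholder filename; vendors are not obliged to populate the tag, and it does not record everything an AI reconstruction pipeline did.

```python
from PIL import Image

EXIF_SUBIFD_POINTER = 0x8769   # standard pointer tag to the Exif sub-IFD
DIGITAL_ZOOM_RATIO = 0xA404    # standard EXIF DigitalZoomRatio tag

def digital_zoom_ratio(path: str) -> float | None:
    """Return the recorded DigitalZoomRatio, or None if absent/zero."""
    exif = Image.open(path).getexif()
    value = exif.get_ifd(EXIF_SUBIFD_POINTER).get(DIGITAL_ZOOM_RATIO)
    return float(value) if value else None

MAX_TRUSTED_RATIO = 3.0  # hypothetical policy threshold, not an industry standard

ratio = digital_zoom_ratio("super_zoom_sample.jpg")  # placeholder file
if ratio is None:
    print("No digital zoom recorded; provenance unclear, verify by other means.")
elif ratio > MAX_TRUSTED_RATIO:
    print(f"~{ratio:.1f}x digital zoom: treat as a reconstruction, not evidence.")
else:
    print(f"~{ratio:.1f}x digital zoom: within the trusted range for this workflow.")
```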

Why This Matters Beyond Spec Sheets

For developers, camera and ISP engineers, and product leaders, this shootout is more than brand drama.

Key takeaways:

  1. Cohesive sensor design can rival (or beat) spec-maximalism. OnePlus’ triple-50MP setup simplifies processing assumptions and yields competitive consistency.
  2. AI zoom has crossed a threshold. OnePlus’ 120x performance demonstrates how far reconstruction has come — and how urgently we need clearer disclosure and guardrails when these images influence trust-sensitive workflows.
  3. View-context-aware tuning wins. Both vendors optimize for how photos are actually consumed (phone screens, social feeds), not 1:1 crops on 32-inch monitors — but they do so in divergent ways that downstream apps must understand.
  4. Secondary lenses are the new differentiator. Many real-world wins and losses now happen on ultrawide and telephoto, not just the hero sensor. Ecosystem developers building camera-first apps need to test across these modules, not only the primary.

Where Flagship Cameras Go From Here

The Six Flags side-by-side doesn’t crown an absolute winner; it sketches two distinct futures for mobile imaging:

  • Samsung as the champion of maximal hardware optionality and bold, contrasty output — powerful when tuned right, unforgiving when it’s not.
  • OnePlus as the advocate for coherent sensors, disciplined computational photography, and shockingly strong extreme zoom that borders on generative reconstruction.

For the people building the next generation of camera stacks, vision models, AR toolkits, and imaging silicon, this duel is the real signal: the battle is no longer about who has more megapixels, but who can orchestrate photons, priors, and perception into something users trust — and experts can still verify.

Source attribution: All empirical comparisons, sample descriptions, and device configurations are derived from Adam Doud’s original review and camera shootout for ZDNET, independently tested and edited under ZDNET’s published editorial standards.