Camera sensor crop factor: What it means and why you should care (or not)


If you’re in the market for an interchangeable lens camera, you’ve probably heard the term “crop factor” thrown about in reference to different formats. Basically, crop factor refers to how the field-of-view of a given lens changes with different sensor sizes. The term itself comes from the fact that a smaller sensor sees a smaller portion of the scene, and thereby “crops” the image relative to a larger sensor.

Sensors come in many sizes, and, in general, physical size has a greater bearing on image quality than the number of megapixels. The short explanation for this is that a larger sensor gathers more light, and light is what photography is all about. This is true even with film: Larger film formats result in higher image quality, which is why Quentin Tarantino made such a fuss about The Hateful Eight being shot on 70-millimeter film.


However, larger sensors necessitate larger lenses, which leads to heavier, bulkier camera systems overall. As smaller sensors have improved in image quality in recent years, many people simply don’t need the quality bump that comes with upgrading to a larger format. You also need to factor weight and price into the equation, which often gives the advantage to smaller formats.

Common sensor formats

Among consumer cameras, crop factor is always in reference to “full frame,” a sensor size equal to a frame of 35-millimeter film. So the crop factor is the ratio of the image sensor size to 35mm film. This means that your Nikon D850, Canon EOS R, Sony A7 III, or other full-frame camera has a crop factor of 1X. Micro Four Thirds (MFT) cameras, like those from Olympus and Panasonic, have a crop factor of 2X, while APS-C sensors have a crop factor of 1.5X (unless it’s a Canon, and then it’s 1.6X, because Canon had to go and be different). Oh, and just so you’re aware, Nikon refers to its full-frame cameras as FX and its APS-C cameras as DX. Confused yet?

Visualization of the approximate frame size of full-frame, APS-C, and Micro Four Thirds. Daven Mathies/Digital Trends

What this means is that if you have a 50-millimeter lens and put it on an APS-C body, it will offer an equivalent field of view of a 75-millimeter lens on a full-frame camera (50 × 1.5 = 75). Likewise, on an MFT body, it will look like a 100-millimeter lens, as if being “zoomed in” by a factor of two.
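If you'd rather see the math spelled out than do it in your head, the conversion is just a multiplication. Here's a quick sketch using the crop-factor values mentioned above (the dictionary keys are made up for illustration):

```python
# Approximate crop factors for common formats, relative to full frame
CROP_FACTORS = {
    "full_frame": 1.0,
    "aps_c": 1.5,   # 1.6 for Canon APS-C bodies
    "mft": 2.0,     # Micro Four Thirds
}

def equivalent_focal_length(focal_length_mm, sensor_format):
    """Full-frame-equivalent field of view for a lens on a given sensor."""
    return focal_length_mm * CROP_FACTORS[sensor_format]

print(equivalent_focal_length(50, "aps_c"))  # 75.0
print(equivalent_focal_length(50, "mft"))    # 100.0
```

Same lens, same optics; only the portion of the image circle the sensor sees changes.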

Speaking of lenses, it is possible to use a lens from a larger format on a smaller sensor, either through direct compatibility (e.g., Nikon FX to Nikon DX) or with an adapter (e.g., Canon full-frame to Sony APS-C). Going the other way, while technically possible in some cases, is generally not recommended: a lens made for a smaller format won’t project an image circle large enough to cover a larger sensor (so you shouldn’t use an MFT lens on an APS-C or full-frame camera).

Crop factor and depth of field

Crop factor can also be applied to aperture, which illustrates how sensor size affects both light gathering and depth of field. A wider aperture lets in more light and creates a shallower depth of field. A full-frame lens with an aperture of f/2.8 will have an equivalent aperture of (roughly) f/4 on an APS-C camera, or f/5.6 on an MFT camera. This means it is possible to achieve a shallower depth of field (think blurrier backgrounds) with a larger sensor, all else held equal.
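The aperture conversion works the same way as the focal-length one: multiply the f-number by the crop factor. A rough sketch (function name is my own, not a standard API):

```python
def equivalent_aperture(f_number, crop_factor):
    """Full-frame-equivalent aperture (for depth of field), given a crop factor."""
    return f_number * crop_factor

# A 50mm f/2.8 lens, expressed in full-frame depth-of-field terms:
print(round(equivalent_aperture(2.8, 1.5), 1))  # 4.2 -> roughly f/4 on APS-C
print(round(equivalent_aperture(2.8, 2.0), 1))  # 5.6 on Micro Four Thirds
```

Note this only describes depth of field and total light gathered; the exposure your meter reads at f/2.8 is the same on any format.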

This does not mean that if you put a Nikon 50mm f/1.4 FX lens on a DX body that the camera will indicate the lens is f/2 — it won’t. This is simply a way to understand how sensor size impacts depth of field and sensitivity.

While the ideas of equivalent focal length and equivalent aperture can be confusing, for the regular consumer, the most important thing to remember here is simply this: What you see is what you get. All you really have to do is slap a lens onto your camera and look through the viewfinder or LCD. You don’t have to perform any math in your head if you don’t want to.

That said, it is helpful to keep these concepts in mind when shopping for a new camera or lens, especially if you currently shoot one format and are considering switching to another. Knowing how lenses will behave differently from what you’re used to will help guide you to the right purchasing decision.

Daven Mathies
Daven is a contributing writer to the photography section. He has been with Digital Trends since 2016 and has been writing…