The notion that 72 dpi is the resolution for all web output has been wrong for years. The interface in your favorite graphics software leads you to believe that it exists… and it does, but there’s more to it than that.
Twenty-five years ago, a 13″ CRT monitor had a resolution of roughly 72 dots (pixels) for every inch on the screen. This was a value that graphic artists could use to estimate the actual dimensions of an image, just like picas for printing or points for font sizes. Your modern monitor most likely delivers at least 130 pixels per inch, or much more if you are using a high-resolution monitor like an Apple Retina display, which packs over 300 pixels per inch.
Unfortunately, despite the change in technology, the 72 dpi myth is still alive and well. As a result, many people are confused about the relationship between dots/pixels per inch and the actual size of an image in pixels.
Let’s get back to basics: a photo taken on your digital camera has NO resolution – only PIXELS.
Picture your camera sensor as a miniature Lite-Brite toy with 3000×2000 holes, producing a 6-megapixel image.
Once in Photoshop, the image still contains only those 6 million pixels — NOTHING ELSE! It does not have a “pixels per inch” number, because the image does not have a physical length or width at this point. It’s just a bunch of pixels sequenced in a particular order.
Design software displays a resolution per inch, as if the image were to be physically printed on paper. Until that translation to a physical printout is made, the image itself is only 6 megapixels. When you or your design software decides how those pixels will translate to a physical dimension, the dpi can be calculated. Printing this file at 12×8″ would yield a resolution of 250 dpi (3000 pixels spread over 12 inches). Doubling the printed dimensions would lower the resolution to 125 dpi. The quantity of pixels in the original file from your camera doesn’t change, but the way they are spread out over a physical space does.
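The arithmetic is simple enough to sketch in a few lines of Python — dpi is just a pixel count divided by a physical length, nothing more:

```python
def print_dpi(pixels: int, inches: float) -> float:
    """Resolution along one axis when a fixed pixel count
    is spread over a physical print length."""
    return pixels / inches

# 3000 pixels printed across 12 inches:
print(print_dpi(3000, 12))   # 250.0

# Double the print width and the same pixels thin out:
print(print_dpi(3000, 24))   # 125.0
```

Notice that the pixel count (3000) never changes between the two calls; only the denominator — the physical space — does.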
Pixels and dpi are simply two different units of measure. Here is another analogy to make the concept clearer: the HD movie standard only requires 1920×1080 pixels from your camera, and will always use exactly 1920×1080 pixels. Whether you play that movie on a 19″, 42″ or 80″ screen, the pixel count stays the same. The larger TV doesn’t add data to the image, and the smaller TV doesn’t strip it out. Think of it as compressing or stretching the same amount of data over different-sized spaces.
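The TV analogy can be checked with the same kind of arithmetic. A sketch, using the standard pixels-per-inch formula (diagonal pixel count divided by diagonal inches); the screen sizes are the ones mentioned above:

```python
import math

def screen_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density of a screen: diagonal length in pixels
    divided by diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1920x1080 frame on three different screens:
for diagonal in (19, 42, 80):
    print(f'{diagonal}": {screen_ppi(1920, 1080, diagonal):.0f} ppi')
```

The pixel count is identical in every case; only the density changes — roughly 116 ppi on the 19″ screen down to about 28 ppi on the 80″ one.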
Test it yourself
1. Open a picture in Photoshop (3000×2000 pixels as a reference).
2. Photoshop will present a resolution value, for example 240 dpi.
3. Change the dpi value to 72.
4. You may assume the image has shrunk, but make sure you are looking at the pixel count.
5. Make sure you are looking at the PIXEL units — if need be, type the original values (3000×2000) back in, and the file size will once again be the same as before (34.3 MB in this example).
6. By default your favorite software may resample the image, but as long as the pixel dimensions stay the same, the image itself remains unchanged.
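The same experiment can be run in code. A minimal sketch using the Pillow imaging library (assumed installed as `PIL`): the dpi value is stored as metadata in the file header, and changing it never touches the pixel grid.

```python
import io

from PIL import Image  # Pillow, assumed installed

# Stand-in for the 6-megapixel camera file from the example above.
img = Image.new("RGB", (3000, 2000))

# Save it tagged as 72 dpi -- this only writes a metadata field.
buf = io.BytesIO()
img.save(buf, format="JPEG", dpi=(72, 72))
buf.seek(0)

reopened = Image.open(buf)
print(reopened.size)         # (3000, 2000) -- the pixels never changed
print(reopened.info["dpi"])  # (72, 72) -- just a label for printing
```

Tag the same pixels as 10 dpi or 1000 dpi and the file still contains exactly 3000×2000 of them.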
Urban legends die hard. Don’t worry: your cell phone will not make your gas tank explode, a box of tissues cannot kill you in a car accident, and Walt Disney was never cryogenically frozen.