The Relationships Between Pixels, Points, Picas, Dots And Inches

A common question on Yahoo! Answers revolves around the relationships between inches, pixels, points, dots and picas.

The short answer is that a pixel, the basic dot on a screen, is the same thing as a point, the printer’s smallest measure. That one fact dictates all the relationships in online design, from why online images are always set to 72 dpi (dots per inch) to how to convert back and forth between inches and pixels.

A pixel is a dot is a point. Remember that and everything else falls into place: Points, pixels and dots are all the same thing.

How many pixels in an inch?

In printers’ measures, there are 12 points to a pica and 6 picas in an inch. So, 12 x 6 = 72; there are 72 points in an inch. And since points and pixels are the same thing, there are 72 pixels in an inch, too.

So, if you want something on screen to be 4 inches wide: 4 x 72 = 288. An element you want to have take up 4 inches horizontally on the screen should be made 288 pixels wide.
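If you prefer to see that arithmetic as code, here is a minimal sketch of the conversions, assuming the 72-pixels-per-inch convention used throughout this post (the constant and function names are mine, purely for illustration):

    # Printer's measures as described above: 12 points to a pica, 6 picas
    # to an inch, and (per this post's convention) 1 point = 1 pixel.
    POINTS_PER_PICA = 12
    PICAS_PER_INCH = 6
    PIXELS_PER_INCH = POINTS_PER_PICA * PICAS_PER_INCH  # 72

    def inches_to_pixels(inches: float) -> int:
        """Convert a physical width in inches to pixels at 72 px per inch."""
        return round(inches * PIXELS_PER_INCH)

    def pixels_to_inches(pixels: int) -> float:
        """Convert a pixel width back to inches at 72 px per inch."""
        return pixels / PIXELS_PER_INCH

    print(inches_to_pixels(4))    # 288 -- a 4-inch element is 288 pixels wide
    print(pixels_to_inches(288))  # 4.0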

Scale is what matters

Many other resources will tell you there are 96 pixels in an inch. And that is often true, as far as it goes.

How many pixels truly constitute one inch of screen real estate is a product of the monitor itself. A pixel is a real thing: a picture element (“pixel” is an abbreviation of that term). It is one tiny dot that the screen can make red, green or blue. When viewed from afar, each pixel blends in with all the other pixels, making a color picture.

In most CRT (cathode-ray tube) monitors, 72 pixels physically measure one inch: line 72 CRT pixels up side by side and they span exactly one inch. In most LCD screens, 96 pixels measure one inch. Plasma screens also tend to have about 96 physical pixels per inch, although some plasma screens have many more; because of the way plasma TVs work, their physical pixels-per-inch count is somewhat arbitrary.
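To check what an element would actually measure on a particular display, the arithmetic is just pixels divided by the screen’s physical pixels per inch. A quick sketch, assuming the 72 and 96 ppi figures quoted above (the function name is mine, for illustration only):

    def physical_width_inches(pixels: int, screen_ppi: float) -> float:
        """Physical width of an element on a screen with the given pixels per inch."""
        return pixels / screen_ppi

    # The same 288-pixel element measured on the two kinds of screen described above:
    print(physical_width_inches(288, 72))  # 4.0 inches on a typical CRT
    print(physical_width_inches(288, 96))  # 3.0 inches on a typical LCD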

So sure, you could use 96 pixels per inch and feel pretty comfortable about the measurement being physically true.

But what matters in graphic design is scale, not a 1:1 match between what appears on the screen and its real-life counterpart.

What I mean by that is, so long as the elements in a design maintain the same scale, those elements don’t need to be a 1:1 representation on screen. A 1-inch square doesn’t need to measure exactly 1 inch on the screen, provided a 2-inch square is exactly twice as large as the 1-inch square.

For a real-world example of this, consider a kindergarten classroom.

When you were little, all the furniture in the room seemed normal size. The chairs were the right size, the tables were the right size, everything seemed the right size. The teacher, however, was big. And so was the bus, and the playground.

Today, if you go to the same kindergarten classroom, all the chairs, tables and other items seem tiny.

The reason is scale. When you were little, little things didn’t seem so small, but big things seemed huge. Now that you’re bigger, big things seem normal and small things seem tiny. Your sense of scale changed as you grew up, keeping pace with you as the reference point.

The same is true of graphic design. So long as we fix the size of a given element, and make the size of all other elements in scale with that fixed size, then the design looks right.
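Put in code terms, that just means picking one reference value and deriving every other size from it. A minimal sketch, with an arbitrary reference unit chosen purely for illustration:

    # Pick one fixed reference: how many pixels a "1-inch" design unit occupies
    # on screen. 72 is used here; 96 would work just as well, because only the
    # relationships between elements matter.
    REFERENCE_PX_PER_INCH = 72

    def scaled_size(design_inches: float) -> int:
        """Size an element relative to the fixed reference unit."""
        return round(design_inches * REFERENCE_PX_PER_INCH)

    one_inch_square = scaled_size(1)  # 72 px
    two_inch_square = scaled_size(2)  # 144 px

    # Whatever reference we pick, the 2-inch square is always exactly twice
    # the 1-inch square, so the design stays in proportion.
    assert two_inch_square == 2 * one_inch_square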

Why are Web images 72 dpi?

You can think of resolution as the precision of the image — how sharp and detailed it is. (That’s not 100 percent accurate, but for our purposes, it’s close enough to right to be right.) The more resolution an image has, the sharper and clearer it generally appears in print.

The operative phrase in the preceding sentence is “in print.”

Different kinds of paper can reproduce images within certain ranges of resolution. For example, newsprint is poor quality, so most newspapers print images at about 120 dpi (actually, they print halftones, not images; but I digress). Your home printer generally prints at about 300 dpi for a word processing document, which is more than sufficient to ensure that letters are sharp; your home photo printer probably produces images at 600 dpi, which ensures sharp edges in small photo reproductions. Some magazines may print at about 900 dpi, and lithographs on specialty stock may be printed at 1,200 dpi or more.
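To get a feel for what those print resolutions mean in pixel terms, you can work out how many pixels an image needs for a given physical print size. A rough sketch using the ballpark dpi figures above (the figures are approximations, as noted, and the names are mine):

    # Ballpark output resolutions mentioned above, in dots per inch.
    PRINT_DPI = {
        "newsprint": 120,
        "home printer (text)": 300,
        "home photo printer": 600,
        "magazine": 900,
        "lithograph": 1200,
    }

    def pixels_needed(width_in: float, height_in: float, dpi: int) -> tuple[int, int]:
        """Pixel dimensions an image needs to print at the given size and resolution."""
        return round(width_in * dpi), round(height_in * dpi)

    # A 4 x 6 inch print at each output resolution:
    for medium, dpi in PRINT_DPI.items():
        print(medium, pixels_needed(4, 6, dpi))
    # newsprint (480, 720) ... lithograph (4800, 7200)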

But all monitors, whether CRT, LCD or plasma, use the same method to produce an image: they illuminate tiny, colored dots to make a picture. And therefore, they all have the same resolution: 72 dpi.

Why 72 dpi? Because again, to maintain the basic units of scale between print and monitor, a point is a pixel is a dot, and there are 72 points to an inch — which means there are 72 dots to an inch. If we want something that measures one inch wide to be one inch wide on the screen, we need to set its resolution to 72 dpi.

Here are some examples of the effect of different resolutions on the same image. First, a 250-pixel square at 72 dpi.

[Image: 250px square @ 72 dpi]
This square is 250px wide, with a 72 dpi resolution.

Let’s look at the same image at 150 dpi. It should appear on screen slightly more than twice the size of the square above.

[Image: 250px square @ 150 dpi]
This square is 250px at a resolution of 150 dpi.

And here’s the same square at 36 dpi. This time, it’s going to appear to be about half the size of the first square.

[Image: 250px square @ 36 dpi]
This is a 250px square at a resolution of 36 dpi.
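The sizing rule those three squares illustrate, as this post describes it, comes down to a single ratio: the displayed size scales with the image’s dpi setting relative to the 72 dpi baseline. Here is a small sketch of that rule only; it is not a claim about how any particular browser or image viewer handles dpi metadata:

    BASELINE_DPI = 72  # the on-screen baseline this post uses

    def apparent_scale(image_dpi: float) -> float:
        """On-screen scale factor for an image, per the rule described in this post."""
        return image_dpi / BASELINE_DPI

    for dpi in (72, 150, 36):
        print(dpi, "dpi ->", round(250 * apparent_scale(dpi)), "px apparent width")
    # 72 dpi  -> 250 px (unchanged)
    # 150 dpi -> 521 px (slightly more than twice the size)
    # 36 dpi  -> 125 px (about half the size)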

Why doesn’t everything measure inch for inch on the screen?

As noted before, not all screens render a 72 dpi image at exactly one inch wide. That’s because a screen can be set to render at a range of different sizes.

In computer terms, the width-and-height setting of the screen is called its resolution, but that isn’t technically accurate. Changing the size of the screen isn’t really changing its resolution; it’s just changing the size of the things the screen displays (the scale of the items) and therefore increasing or decreasing the number of things that can be shown, in roughly the same way you can fit a lot more information on a broadsheet newspaper page than on a Post-It note. Computer geeks call different screen sizes “resolution” because it’s convenient to do so, but it isn’t accurate.

The screen only comes with X number of horizontal dots and Y number of vertical dots. If you change the width and height the screen is going to show, that doesn’t increase or decrease the number of horizontal or vertical dots in the screen; it has just as many as it had when it came out of the box, regardless of your settings.

Further, unlike paper, you can’t move the dots any closer or further apart than they are on the screen.

For example, in printing, if the paper is good enough, I can print many tiny dots very close together in order to get a very sharp picture; the better the paper, the smaller the dots I can reproduce and the closer together they can be. Because the number of dots / points in an inch is the true measure of resolution, the more dots I can fit in an inch, the sharper the picture will be in print.

But with a screen, the dots are always in the same place, the same size, and can’t be moved closer together. So, the way a screen manages to handle having its width and height changed is to, simply put, change scale.

For example, suppose a specific model of computer monitor happens to render items 1 for 1 when it is set to 800 x 600 pixels; that is, if something is 1 inch square, you can put a ruler up to the screen and it will measure 1 inch square.

Suppose we increase the screen “resolution” to 1024 x 768 pixels. The monitor can’t add dots in order to increase the amount of space it can show; it has as many dots as it started with, but it has to show more. To compensate for the added space it needs to show, the screen simply reduces the size of everything it is showing by a proportional amount. In our example, the screen would render our 1-inch square as a 0.78-inch square (1 / 1.28 = 0.78); that is, it simply throws out 0.22 inches of the square.

The screen can get away with doing that because so long as it reduces the size of everything on screen by the same percentage, everything remains in scale — even though everything on screen is actually smaller than it truly measures, in relation to everything else on screen, it’s in proportion, so it looks correct.
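That compensation can be written down directly: on a fixed-size panel, raising the width-and-height setting shrinks everything by the ratio of the old setting to the new one. A small sketch of the example above (the monitor that is true to size at 800 x 600 is hypothetical):

    def rescale_factor(old_width_px: int, new_width_px: int) -> float:
        """How much everything shrinks (or grows) when the screen setting changes."""
        return old_width_px / new_width_px

    def apparent_inches(true_inches: float, old_width_px: int, new_width_px: int) -> float:
        """Physical size an element appears to be after the resolution change."""
        return true_inches * rescale_factor(old_width_px, new_width_px)

    # The example above: a monitor that is true to size at 800 x 600,
    # then switched to 1024 x 768.
    print(round(rescale_factor(800, 1024), 3))        # 0.781
    print(round(apparent_inches(1.0, 800, 1024), 2))  # 0.78 -- the 1-inch square
    print(round(apparent_inches(2.0, 800, 1024), 2))  # 1.56 -- still exactly twice as big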

7 Comments

  1. @Kaushai: Since all screens, regardless of make, type or resolution, can only render a fixed number of points in a given space, adding twice as many points to be rendered in an image — which is the net effect of changing a 250-pixel square’s resolution from 72 dpi to 150 dpi — forces the screen to scale the higher-resolution image to be twice as large as intended.

    As I noted, that’s because on a computer screen, what matters is not a 1:1 rendering (that is, something which is an inch wide in print / real life actually appears 1 inch wide on screen), but scale — that everything which appears on the screen is the right size, relative to every other item on the screen.

    Therefore, an image that is 250 pixels square at 150 dpi resolution will always render on screen at about twice the size of a 250-pixel square at 72 dpi, on any computer screen of any size, type or manufacture. That is by virtue of physical necessity (only so many pixels exist) and practical application (objects that are twice as big as something else should render on screen twice as big as the smaller object).

  2. No, Kaushal is right. If you have 250px then you only have 250 points, pixels, dots, whatever fits the medium you’re using. The dpi then determines how much space those take up. 150 dpi means 150 dots in an inch. 150 of those 250 pixels go into the first inch, which leaves 100, so it’s less than 2 inches wide. But at 72 dots to an inch, 72 of the 250 take up the first inch, which leaves 178, and 72 of those go into the second inch. So we’re at 106, which is enough for another 1.47 inches. So at 72 dpi, 250 px is almost 3.5 inches, while at 150 dpi it’s under 2 inches. This type of thing is important usually when going from a file on a computer (uses pixels) to printing it out on paper/etc. (this uses dots and inches).

    Now if you have an image that’s 2 inches or 2 pica, then increasing the dpi does make it bigger on a screen because 2 inches at 72 dpi gives you 144 pixels while those same 2 inches at 300 dpi gives you 600 pixels. This type of thing is important usually when scanning something in, so it goes from inches to pixels.

    If what you are saying were true, then Retina, HD, and other high-resolution monitors with a high dpi (or, to be more correct, ppi) would be causing problems because everything on their screens would be too big. Trust me, the issue people are running into is that everything is too small.

  3. @Jacq: You’re not seeing the forest for the trees.

    Also, you’re wrong. I made the images as I noted, and simply posted them. WYSIWYG. Your math is simply wrong because it is not how Web browsers interpret DPI.

    But all that is meaningless.

    As this post notes, what matters is scale, not a 1:1 measure. That is, since you cannot have a 1:1 representation of physical dimensions onscreen, because you never know the resolution of the monitor, you simply decide how big the smallest thing on the screen should be, and then make everything else proportionally larger.

    Again, as the post clearly indicates, I use 72 pixels in an inch because it’s a convenient way to maintain scale.

    If you find that is too small for your desires, by all means, double it. Because, once more and for the final time, what matters is scale, not a 1:1 representation of physical measurement.

  4. Sorry Doug, but take it to the extreme: let’s say at 1 dpi (1 dot per inch), a 250-dot-wide square would be 250 inches wide. And at 250 dots per inch, the same square of 250 dots would be only 1 inch wide. The higher the dpi, if you keep the total number of dots in the image the same, the smaller it appears to be. That’s how scale works.

  5. @Steve, that’s not correct. You’ve conflated different screen sizes. I am referring to two different resolutions on the same screen.

    It is true that if I have identical 15-inch screens, one of which is 1280×720 and another which is 1920×1080, the same 250-pixel box is going to appear smaller on the 1920×1080 screen.

    However, on a single screen of any resolution, a 250-pixel box is going to be half the size of a 500-pixel box.
