90

If you calculate the area of a rectangle, you just multiply the height and the width and get back the unit squared. Example: 5cm * 10cm = 50cm²

In contrast, if you calculate the size of an image, you also multiply the height and the width, but the result keeps the same unit - Pixel - that the height and the width had before multiplying. Example: what you actually calculate is 3840 Pixel * 2160 Pixel = 8294400 Pixel

What I would expect is: 3840 Pixel * 2160 Pixel = 8294400 Pixel²

Why is the unit not squared when multiplying Pixels?

Raphael
JFFIGK
  • Comments are not for extended discussion; this conversation has been moved to chat. – D.W. Feb 14 '18 at 17:27
  • 1
    @JLRishe Please don't post answers as comments, especially when the point you're raising has already been covered by the existing answers. The question was protected precisely to prevent this kind of repetition. – David Richerby Feb 19 '18 at 17:01
  • The answer depends on the definition; Wikipedia gives several. It is however fairly clear which is meant here. It is also more a question about conventions of usage than about logical necessity. – PJTraill Mar 21 '22 at 09:57

11 Answers

239

Because "pixel" isn't a unit of measurement: it's an object. So, just like a wall that's 30 bricks wide by 10 bricks tall contains 300 bricks (not bricks-squared), an image that's 30 pixels wide by 10 pixels tall contains 300 pixels (not pixels-squared).

David Richerby
  • 19
    “pixel” is actually often used as, and interchangeable with, other units of measurement. The most prominent example being in CSS. Dimensional analysis plays no role in these applications but if it did, we’d probably indeed (have to) use pixel². – Konrad Rudolph Feb 13 '18 at 18:38
  • 15
    pixel is already a measure of an area (like acre or football field). It is defined by the smallest controllable area of a display. This area is small on a monitor and can be large on a billboard. Assume a pixel is 2mm wide and 1mm high -> pixel = 2mm². A screen of 800x600 is 800*2mm by 600*1mm = 960000mm², but always 480000 pixels – Joachim Weiß Feb 14 '18 at 07:49
  • 8
    I'd say it's not a one-dimensional unit of measurement. Every object can be used as a unit of measurement for a quantity in its own dimension, or lower. Pixels are two-dimensional, so we can use them to count out areas (but also lengths, by abbreviating "as long as the sides of 1080 pixels"). – Raphael Feb 14 '18 at 10:13
  • @JoachimWeiß I tend to agree that it should be so; however, the px unit as widely used in CSS is actually a measure of length, not area. – leftaroundabout Feb 14 '18 at 11:18
  • @leftaroundabout not quite... an area of 10 px height and 10 px length need not be rendered as a square. It's more like Lego: a px is a Lego tile, in the best case a 2x2, but it can be a 2x4 brick. If you define a height of 10, it is 20 "spots" high, but it always has a width as well. – Joachim Weiß Feb 14 '18 at 12:13
  • +1 for the brick analogy and putting that picture in my head! Made me laugh :D – CodingInCircles Feb 14 '18 at 21:23
  • 20
    The London Bus is also a popular unit of width, height, length, and volume, but no-one would ever speak of a cubic Bus. – Michael Kay Feb 14 '18 at 22:03
  • 4
    @MichaelKay Weight, rather than volume. Volume is measured in Olympic swimming pools. :-P – David Richerby Feb 14 '18 at 22:10
  • 2
    Right, a pixel is a sample or a physical light-emitting unit, depending on whether you're looking at the abstraction or the hardware. So an object, not a unit. And this causes a lot of problems since people also think it's a good candidate for a unit, like in CSS, which it's not. – joojaa Feb 16 '18 at 06:50
  • 1
    Short and to the point with one exception: Bricks would actually be voxels, not pixels. – NoDataDumpNoContribution Feb 16 '18 at 14:40
  • 2
    IMHO, this explanation is the simplest, and therefore the best.

    @Trilarion - If you remove the z axis when you look at a brick wall, you wind up with pixels. Consider that the brick wall is 1 brick wide; then you can, for the purposes of this example, look at them like pixels.

    – Scuba Steve Feb 16 '18 at 21:01
  • Why is it not appropriate to view Pixels or bricks from the unit perspective? Why do they have to contradict? – JFFIGK Feb 19 '18 at 16:49
  • @JFFIGK "brick-squared" has no physical meaning. You're not measuring something in bricks: you're counting a number of bricks. In contrast, you don't count the number of meters; you measure the wall in meters. – David Richerby Feb 19 '18 at 16:59
  • @DavidRicherby I disagree: for instance, I can measure the volume of the wall in bricks. Indeed, bricks² makes no sense to me either at this time. But the idea of measuring the length of a wall in brick widths makes sense again. – JFFIGK Feb 19 '18 at 18:28
  • @JFFIGK You can measure the volume of the wall and express that as a multiple of the volume of a brick. Or you can count the bricks in the wall. But brick-volumes and brick-widths aren't bricks. – David Richerby Feb 19 '18 at 18:48
  • If I express the volume of something with bricks, would that not imply that I am measuring something with bricks? – JFFIGK Feb 19 '18 at 19:08
  • 1
    @leftaroundabout The CSS length unit px is not (or no longer) related to the pixel = picture element. Meaning: For some devices, a screen area that is 10px wide and 10px high may take more than 100 RGB data points. – Hagen von Eitzen Feb 21 '18 at 11:43
122

I have a different answer from other folks: pixel is the correct unit for areas, and you do need dimensional analysis. The discrepancy is that the pixel in "3840 pixels wide" is not the same unit as the pixel in "the display has 8294400 pixels". Instead, "pixel" is a natural-language abbreviation for different units at different times, and it takes some context and judgment to expand the abbreviation appropriately.

The unabbreviated form is "3840 pixel-widths wide x 2160 pixel-heights tall = a bazillion pixel-areas" (and one "pixel area" is definitionally equal to "pixel-width * pixel-height" for rectangular pixels).

N.B. it is frequently assumed that pixel width and pixel height are equal (as in the CSS discussion in the other answer), and even without that assumption the above assumes that pixels are rectangular -- and these assumptions are often but not always true!
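
To make the expansion concrete, here is a minimal TypeScript sketch (the branded types and helper names are invented for illustration, not any standard API) in which pixel-widths, pixel-heights, and pixel-areas are kept distinct, and an area only arises from multiplying a width by a height:

```typescript
// Illustrative sketch only: "branded" number types standing in for the
// distinct units pixel-width, pixel-height and pixel-area.
type PixelWidths  = number & { readonly unit: "pixel-widths" };
type PixelHeights = number & { readonly unit: "pixel-heights" };
type PixelAreas   = number & { readonly unit: "pixel-areas" };

const pixelWidths  = (n: number) => n as PixelWidths;
const pixelHeights = (n: number) => n as PixelHeights;

// One pixel-area is definitionally one pixel-width times one pixel-height,
// so multiplying the two length-like units yields the area-like unit.
function imageArea(w: PixelWidths, h: PixelHeights): PixelAreas {
  return (w * h) as PixelAreas;
}

const area = imageArea(pixelWidths(3840), pixelHeights(2160)); // 8294400 "pixel-areas"
```

In everyday speech all three of these get abbreviated to "pixel", which is exactly where the apparent unit mismatch comes from.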

Daniel Wagner
  • 4
    I think you’re historically correct: see for instance the usage of twips in VB6, which was a device independent unit that converts to device dependent X- and Y-pixels (and the twips/X-pixel and twips/Y-pixel conversion factors depended on the canvas scale and were generally not identical). – Konrad Rudolph Feb 13 '18 at 18:54
  • 18
    In fact the unit of width and height is √pixel, multiplying together gives you pixels :o) – Will Crawford Feb 13 '18 at 19:47
  • 27
    @WillCrawford You have fallen into the assumption that pixels are square! Or else you don't want the square root. =) – Daniel Wagner Feb 13 '18 at 20:46
  • I love the idea of Polar Pixels/Voxels for when you want to map onto/into a sphere, as opposed to Cartesian Pixels/Voxels. In fact, checking, I see that Polar Voxels can be useful for tomographic image reconstruction. – Mark Booth Feb 14 '18 at 12:49
  • 18
    Pixels aren't a distance or an area per se. They're discrete objects. – Shufflepants Feb 14 '18 at 15:24
  • 5
    Too bad this is so heavily upvoted when it's simply wrong. A pixel is a single, unresolvable, element of a sampled image. No more, no less. The fact that pixels when realized physically, such as LCD elements or CRT colorbars, have an area, or an equivalent image solid angle, is secondary. When a display is K*J pixels, the entire display has that many pixels, but you do not (yet) know anything about its physical dimensions. (yeah, yeah, images are 2-dimensional, so pixels must have two dimensions unless you're into Peano Curves) – Carl Witthoft Feb 14 '18 at 18:49
  • 6
    @CarlWitthoft I'm not sure I understand which part of my proposed answer you're disagreeing with, exactly. It is common for there to be varying conversion factors between units; e.g. if I have $100, how many euros do I have? Dunno until I go to exchange them -- could be different today and tomorrow. Same with pixel dimensions and physical dimensions: it's true that I don't know how many square inches a pixel is, but it is still sensible to talk about how many pixels I have. And anyway no part of this answer discusses physical dimensions, so what conflict has been set up by your statements? – Daniel Wagner Feb 14 '18 at 18:54
  • For one thing, it's acceptable to describe the minimum resolvable region (square, rectangular, or whatever) of a continuous image as a pixel. For another, a physical implementation of a pixel has identified length and width, but its basic definition is as a portion of an image of undefined size. – Carl Witthoft Feb 14 '18 at 20:10
  • 3
    @CarlWitthoft I still don't see where you are disagreeing with this answer. – mbrig Feb 14 '18 at 20:27
  • @DanielWagner if you have $100 you don't have any euros. What happens at a Bureau de Change is a transaction, not a simple change of unit of measurement. The exchange rate has units of $/€, which is not dimensionless. – bdsl Feb 14 '18 at 23:12
  • 3
    @bdsl Right. And the exchange rate of pixels/square inch is also not dimensionless. – Daniel Wagner Feb 15 '18 at 00:41
  • I'm still a little confused about this answer... Is it the same for other currencies, like Yen or Rupee? I'm traveling to Japan in May and I need to bring my laptop along. – Brad Werth Feb 15 '18 at 22:02
  • This answer is just wrong on so many levels and the logic behind your comments is so broken.. units for areas or dimensions need to be absolute, consult physics or math. Money exchange is not a conversion of units. Pixel is short for Picture-element, which is literally an element of a 2D raster - it has nothing to do with areas or units. Do not bring your philosophical "natural-language" nonsense into exact sciences. – Jerryno Feb 19 '18 at 13:12
  • E.g. when the PAL analog TV signal is digitized, the resulting pixels are not square (per standard). – Pablo H Feb 19 '18 at 18:57
  • @Jerryno I think you are overstating it. Cf. the mole for a somewhat analogous situation in the "exact sciences". – duplode Feb 21 '18 at 13:43
  • @duplode mole is atomic weight divided by a constant - so an absolute unit. Pixel is not an unit, nor is pixel-width or pixel-height. It has nothing to do with dimensions. – Jerryno Feb 21 '18 at 15:10
  • @Jerryno A mole is an amount of substance that contains a certain number of particles. There is a conversion factor to kilograms (the molar mass), but it is substance-dependent. In fact, there are plans to remove the link to the kilogram from the definition of mole. Also (and going back to pixels, lengths and areas) cf. the second part of Nat's answer. – duplode Feb 22 '18 at 04:16
  • @duplode If that happens to mole, it will no longer be a valid unit, but a substitute for count. Nat's answer is wrong, don't link that, units must be absolute in the coordinate system we measure in. When people realized every feet is different it was no longer a unit. That's why pixel is not a unit (which Nat writes), but dip is. The dev deals with dips and the underlying OS uses the display's ppi to display the correct size. A dip has a ppi assigned to it, and the OS makes corrections based on its device ppi. How much IQ does it take to understand this easy matter? Probably more than I first thought.. – Jerryno Feb 22 '18 at 08:54
  • Where can I find the criteria for a valid unit? – JFFIGK Feb 22 '18 at 10:18
  • @Jerryno [1/2] (1) No, the mole will keep being a valid unit for amount of substance, which will continue be a meaningful quantity. No chemist is going to lose sleep over that. (2) There is nothing relative in saying "this PNG image is 168 pixels wide", as we are dealing with an image size (a valid quantity on its own), and not a physical size (a different quantity, which only comes into play when e.g. displaying the image in a screen). (3) People always knew that every feet is different -- it's just that, in older times, the need for an universal standard was not as pressing. – duplode Feb 22 '18 at 13:19
  • [2/2] (4) "Where can I find the criteria for a valid unit?" -- @JFFIGK. I think we could do worse than quoting International Bureau of Weights and Measures documents. Cf. the brochure The International System of Units (SI) (p. 103: "The unit is simply a particular example of the quantity concerned which is used as a reference"; cf. Nat's "anything we use to express measurements"), and the definition in the International vocabulary of metrology (p. 6). – duplode Feb 22 '18 at 13:19
  • @duplode Quantity has nothing to do with units. I give up, there is no point in this, keep living your life good sir. Similarly like a flat-earther would. – Jerryno Feb 22 '18 at 15:01
  • @Jerryno it does not help at all if you merely express your disagreement without a logically driven explanation based on generally accepted facts. – JFFIGK Feb 22 '18 at 16:21
  • @Jerryno By quantity, I mean "property of a phenomenon, body, or substance, where the property has a magnitude that can be expressed as a number and a reference" (as in p.2 of the second IBWM document I linked to above). I can see how there might be some confusion between that and the ordinary meaning of "quantity" -- in Portuguese, for instance, there is a quite different word ("grandeza") for this technical sense, and so such mix-ups are unlikely to arise. – duplode Feb 23 '18 at 02:47
  • @JFFIGK Yeah I know my last comment is not helping and it is by design, because I decided no longer to try to help.. isn't it obvious, Sherlock? Or does every comment have to help? The other party is not capable of logic. I already explained why you can't measure in pixels and why it's not the same as measuring in feet. Isn't it weird on its own that pixel² is not used if it's a dimensional unit? All the academics and scientists who created it must be wrong, right? Try it yourself to explain why and you will also see that the effort is futile. – Jerryno Feb 23 '18 at 09:08
44

A pixel is already a two-dimensional object

In your example, you specify centimeters as a contrasting example. Centimeters are a unit of length, which is by nature a one-dimensional measurement. When measuring areas, we need to talk about square centimeters, which defines the unit as a two-dimensional quadrilateral with right angles and equal-length sides of 1cm. When discussing volume, we then talk about cubic centimeters, which defines the unit as a three-dimensional, six-sided prism with square faces, each side 1cm long.

Since a pixel is an abstract concept, and not a strict unit of measure, it makes sense for it to exist purely as a two-dimensional object. When you're measuring something like screen resolution, you could consider the measurement to use an implied unit of pixel widths, e.g. 1920 pixel widths x 1280 pixel widths.

As an interesting note, there is a three-dimensional pixel, called a voxel, which is defined as a three-dimensional prism with a pixel for each face.

@KlaymenDK had a great note, which is that pixels are so abstract and context-sensitive that they are not necessarily square. This pushes their class further from 'unit of measure' and into 'count of objects' territory.

Skoddie
  • 12
    I really think this is the authoritative answer: pixels are two-dimensional. Note also that, historically and currently, pixels are not always square - or even identical on a unit level (for instance, PenTile displays muddle what a single pixel really is). – KlaymenDK Feb 14 '18 at 11:17
  • 2
    pixels are not an "abstract concept", they are a physical object. Centimeters and other units of measure are the abstract concept. – Stephen Ostermiller Feb 15 '18 at 12:39
  • 3
    @StephenOstermiller this is incorrect. Centimeters have a fixed physical definition as 1/100th of a meter. A meter is defined as the length of the path travelled by light in a vacuum in 1/299,792,458th of a second. Originally it was meant to represent 1/10,000th of the distance between the north pole and the equator but this was difficult to gauge. By contrast, the pixels on my current screen are of both a different size AND shape as the pixels on my phone. Software interprets them, and hardware renders them to a local environment. Centimeters are a physical standard, pixels are abstract. – Skoddie Feb 15 '18 at 17:18
  • 3
    "Physical" mean something that is actually there that you can see. It doesn't mean "standard". It would be more accurate to say "Since pixels vary in size, and are not a strict unit of measure..." – Stephen Ostermiller Feb 15 '18 at 17:51
  • 3
    @StephenOstermiller This is an issue of semantics. A software developer would often interpret a pixel as being more abstract than a centimeter because the measurement of one is known, while the other varies based on device. The correct use of verbiage here will vary depending on the individual's perspective, and I wrote the answer from my own. – Skoddie Feb 15 '18 at 18:34
16

As others have said, it's because a pixel is an object. If you'd like to think of it in terms of units, the equation is technically 3840 pixel-widths * 2160 pixel-heights = 8294400 pixels (where you can think of a pixel as 1 pixel-width * 1 pixel-height).

reffu
  • 5
    I think the 'brick wall' analogy is best - nobody uses K*J bricks^2 to build a wall:-) – Carl Witthoft Feb 14 '18 at 18:46
  • @CarlWitthoft that's true, and I agree that analogy is better, but if someone is set on thinking in terms of units, it's helpful to realize that "10 pixels high" is really shorthand for something like 10*(1 pixel-height) – reffu Feb 14 '18 at 19:27
12

Pixels are weakly typed units. Just like 1 can be coerced into an integer, floating-point value, or string in a weakly typed language, a "pixel" is coerced into whatever unit makes sense in context.

If we were to more strongly type the unit, we'd probably have several:

  1. pixel-width;

  2. pixel-height;

  3. pixel-diagonal; and

  4. pixel-area.

As you correctly point out, if we assume that pixels are squares, then
$$ \left[\text{pixel-area}\right] = {\left[\text{pixel-side}\right]}^2, $$
such that it'd make more sense to speak of square-pixels when discussing image size in terms of pixel-sides.

The thing's just that, if we're talking image size, then a "pixel" is meant to be coerced into a pixel-area, not a pixel-side.

Note: Pixels are units

A unit is literally anything we use to express measurements of some sort. Sure pixels are objects, but so are other units - there's no conflict there. For example, in the US, we still measure lengths in feet.

Object-defined units of measurement were the historical norm, so there's nothing unusual about pixels being object-defined units. Just, there're obvious shortcomings to such definitions, so in recent history standardization efforts have been made. For example, feet are now more formally defined than as a person's foot's length.

That said, the same is happening to the pixel unit:

A device-independent pixel (also: density-independent pixel, dip, dp) is a physical unit of measurement based on a coordinate system held by a computer and represents an abstraction of a pixel for use by an application that an underlying system then converts to physical pixels.

-"Device-independent pixel", Wikipedia

Nat
  • 1
    But the foot (as used for linear measurement) isn’t an object! It’s defined to be some fraction of a meter and the meter is defined in terms of the wavelength of light. In Britain, the foot has been officially defined with respect to a reference bar for over a thousand years, and anything based on lengths of actual feet has been just a local approximation; likewise, the ancient Greeks defined the foot independently of actual feet. – David Richerby Feb 14 '18 at 11:57
  • 1
    @DavidRicherby They've been redefined in terms of the speed of light in a vacuum (the wavelength of light varies, e.g. blue light has a different wavelength than red light), but the original definitions were just use-based. Folks needed to express lengths long before modern standardization, so things could be expressed in terms of body parts, paces, common objects like barley grains (to grab a random definition of "inch" from Wikipedia), etc.. Measurement units and counting objects aren't separate concepts; the notion of "length" itself is basically a concatenation of things. – Nat Feb 14 '18 at 12:46
  • @DavidRicherby Hah I checked Wikipedia; you're right, they did define a "meter" in terms of a particular type of light's wavelength from 1960 to 1983! Actually kinda fun to read up on the history of this stuff. – Nat Feb 14 '18 at 13:01
  • Well, actually, I was mistakenly referring to the current definition! I was confusing it with the second, which is defined as a fixed number of cycles of a specific frequency of light. Anyway, the general point stands: the meter is defined in terms of some property of light. – David Richerby Feb 14 '18 at 13:14
7

The width of an image is being measured in a discrete 2 dimensional space.

The image is 3840 pixels across. This means there is a horizontal band of 3840 pixels (each of which is a 2-dimensional region) that crosses the space. We aren't using pixel as a unit of measurement -- we are actually counting things called pixels.

When we measure how tall it is, we measure 2160 pixels in a vertical band to the top of the image. Again, pixel isn't a unit of measurement, it is a thing we are counting.

If you take a grid of things that is 3840 wide and 2160 tall, you end up with 3840*2160 of them. This is counting.

We could also describe the image as 3840 pixel_widths wide and 2160 pixel_heights tall, and multiply those distances. Then we'd get 3840*2160 (pixel_width * pixel_height) area. This is an area calculation.

These happen to have the same numerical value because pixel_width*pixel_height = pixel_area, and X pixels have an area of X pixel_area.

A difference between these calculations appears when you have non-square pixels and you rotate. Something 10 pixel_widths wide rotated 90 degrees may not be 10 pixel_heights tall. At the same time, the rotation should preserve area (up to rounding).

The ratio between width and height on a pixel is called its aspect ratio. CRTs often had an effective aspect ratio of 1.11 if I remember correctly.
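
A hedged sketch of that rotation effect, using a made-up non-square pixel size (the numbers are chosen only for illustration):

```typescript
// Illustrative only: a non-square pixel, say 0.0111 mm wide by 0.0100 mm tall
// (aspect ratio 1.11, roughly the CRT-era figure mentioned above).
const pixelWidthMm = 0.0111;
const pixelHeightMm = 0.0100;

// A feature spanning 10 pixel_widths covers 0.111 mm horizontally.
const featureLengthMm = 10 * pixelWidthMm;

// Rotate the feature 90 degrees: the same 0.111 mm now runs vertically,
// so it spans about 11.1 pixel_heights, not 10.
const featureInPixelHeights = featureLengthMm / pixelHeightMm;

// The pixel *count* along the feature changed, but the physical extent
// (and, for a 2D region, its area) is preserved by the rotation.
```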

Yakk
6

You may want to look at this in another way:

$$3840\, \frac{\mathrm{px}}{\mathrm{scanline}}\times2160\, \mathrm{scanline}=8294400\,\mathrm{px}.$$

I.e. treat pixels as two-dimensional objects, much like bricks in a wall, and just count them along the display width and height.

Ruslan
  • What new insight does this answer add that hasn't already been said? Multiple answers already noted that pixels are individual objects, most often already of dimension 2. – Discrete lizard Feb 15 '18 at 08:14
  • 4
    Simple: it adds scanline as a unit which still plays nicely with dimensional analysis. – Ruslan Feb 15 '18 at 09:11
  • I thought the scanline bit added new, relevant and useful information. This makes much more sense to me than the brick wall analogy. – peaceoutside Feb 16 '18 at 01:58
4

A pixel on a screen is like a block in your city. You don't say "I love the houses on this city block²", since a block is by definition itself the square, not the side of the square.

user541686
3

Pixels are a discrete unit you want to know the count of. Practically, it is like counting potatoes in several bags. Use the other length as a plain number, so e.g. 1600*900 Pixels, but not 1600 Pixels*900 Pixels. Leaving one number without a unit yields Pixels as expected (instead of Pixels-squared) and is mathematically correct.

rexkogitans
1

Technically a pixel is not a unit of measurement. In optical design pixels are measured in radians. In photogrammetry, particularly when performing interior orientation, pixels are measured in millimeters. For example, if you have a sensor with 5 micron pixels and you talk about a line of pixels 1000 wide, that line is 5mm wide, but it is not 1mm or 0mm tall: it is 0.005mm tall. Pixels are inherently 2 dimensional, sort of. When you include tonal information pixels are actually 3 dimensional. 4th, 5th, or 6th dimensional pixels are, of course, voxels. Within the realm of hyperstacks, each nth dimension is defined and can decompose into its constituent pixels across any dimensionality.

To think about this more concretely (pun intended): if you have a cinder block wall and the blocks are 9x18 inches, then a 10x10 block wall will be 90x180 inches, or 10 blocks wide and 10 blocks tall. I suppose that if you had a 1000x1000 block wall you could say that you had a megablock, but then there would need to be a base2 / base10 debate.

Bottom line, those pixel dimensions might be convenient integers when stored in an image file but in the real world a single pixel can have any number of dimensions greater than 2 and have sizes of any precision.
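
For instance, the 5 micron sensor example above works out like this (a sketch; the pixel pitch is hypothetical and the pixels are assumed square):

```typescript
// Sketch: relating a pixel count to a physical size via the sensor's pixel pitch.
const pixelPitchMm = 0.005; // hypothetical 5 micron (0.005 mm) square pixels

const lineWidthPx = 1000;                       // a line of 1000 pixels...
const lineWidthMm = lineWidthPx * pixelPitchMm; // ...is 5 mm wide,
const lineHeightMm = 1 * pixelPitchMm;          // but only 0.005 mm tall.
```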

-2

Squaring happens when you multiply units. Pixel is not a unit. You can do that with pixels per inch etc., but not with the pixel alone.