You ever get a metaphor wrong? You know, you hear an expression and then use it without really thinking it through? All these years, I've thought "lowest common denominator" was a pejorative, the phrase critics reach for when dismissing low-brow culture. But today I learned that I'm wrong.
I've been reading Andrew Hodges's new book One to Nine: The Inner Life of Numbers. [Hodges also wrote Alan Turing: The Enigma.] In his chapter about the number Nine, Hodges discusses the term "lowest common denominator," a term that many associate with something that is "small and cheap." But we've got it all backwards:
"A lowest common denominator is generally a rather grand number! This term arises in the context of fractions p/q, where p is called a numerator and q a denominator.
"To add 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + 1/9, the trick is to find the lowest common denominator from 1, 2, 3, 4, 5, 6, 7, 8, 9. That is the smallest number which can be divided exactly by all of 1, 2, 3, 4, 5, 6, 7, 8, 9 — which turns out to be 2,520. You are supposed to write 1/2 as 1260/2520, 1/3 as 840/2520 . . . 1/9 as 280/2520 and so add them up. [Amazing stuff: as it turns out the lowest common denominator is a special number.]
"In practice, few people add up fractions much more complicated than 1/2 hour + 1/4 hour, and I am not surprised that this complicated theory has left little trace on the collective mind. Probably the misuse of the metaphor [of 'lowest common denominator'] has arisen by people confusing it with the 'highest common factor' of a set of numbers, and this is typically something modest."
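If you'd rather not do Hodges's arithmetic by hand, a few lines of Python confirm it — a quick sketch using the standard library (`math.lcm` requires Python 3.9 or later):

```python
from fractions import Fraction
from math import lcm

# The "lowest common denominator" of 1..9 is the least common multiple
denominators = range(1, 10)
lcd = lcm(*denominators)
print(lcd)  # 2520, just as Hodges says

# Add 1/1 + 1/2 + ... + 1/9 exactly, with no rounding
total = sum(Fraction(1, n) for n in denominators)
print(total)  # 7129/2520

# The sum's denominator in lowest terms divides the LCD
assert lcd % total.denominator == 0
```

So the sum of those nine fractions comes out to 7129/2520 — a "rather grand" denominator indeed.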
That's my lesson for today: over the years, the term "lowest common denominator" has gotten an undeserved bad reputation because it has been confused with something else.
Question: what other commonly used metaphors actually mean something quite different from the ideas we usually associate with them?