But that's a simple example because "floob" is a nonsense word. Things get trickier when you define a word that has colloquial meaning, or when you give a single word multiple definitions. If you switch between two definitions in the middle of an argument, you've committed the equivocation fallacy. This is illustrated by a famous exchange from Through the Looking Glass:
"When I use a word," Humpty Dumpty said in rather a scornful tone. "It means just what I choose it to mean - neither more or less."(Humpty Dumpty then proceeds to define the words that appear in "Jabberwocky".)
"The question is," said Alice, "whether you can make words mean so many different things."
"The question is," said Humpty Dumpty, "which is to be master - that's all."
But even if you stick to a single definition, there are still ways it can be wrong. Here I present three; this list may not be exhaustive.
1. A definition is descriptively wrong if it does not match the way the word is used. For example, if I define "glory" as "a nice knockdown argument" (Lewis Carroll's example), this is wildly different from the way "glory" is commonly used. If we say a dictionary is wrong, then what we mean is that it is descriptively wrong. After all, the entire purpose of a dictionary is to describe the ways words are used.
2. A definition is wrong in application if someone purports to define a word in one way, but in practice uses it another way. For example, if someone defines a pencil as a long thin piece of wood with a tube of graphite inside, but later refers to a mechanical pencil as a pencil, then their definition was wrong in application. Either they failed to entirely describe what they think of as a pencil, or they were mistaken to include mechanical pencils in that category.
3. A definition is morally wrong if we judge that using that definition will lead to harm. For example, I might say that it is morally wrong for altmed people to talk about "energy fields", not because physics has a monopoly on the words "energy" or "field", but because those words lend an air of science to something undeserving.
Note that we may still accept definitions even if they are wrong in some of these ways. For instance, we may use descriptively wrong words if we're trying to change the language. And it is appropriate to include morally wrong definitions in a dictionary, since a dictionary's job is to describe usage, not endorse it.
Whenever you call a definition wrong, a fun exercise is to determine the exact way in which it is wrong.
1 comment:
I'd say that another way for a definition to be "wrong" is if it obscures the subject matter when a better definition would produce a clearer understanding of the issues. Some examples from math:
1) As a child, I often wondered why 1 isn't prime. I was taught that a prime number has two factors, 1 and itself, and 1's only factors are 1 and itself. But there is a reason: making 1 prime would mean that the theorems involving prime numbers would need a bunch of exceptions for 1. So making 1 prime would obscure what's special about prime numbers.
2) Some critics have argued that half the circumference of the unit circle is a poor choice of definition for pi, and that pi should instead have been defined as the full circumference of the unit circle. (The symbol tau is proposed for that number.) The principal idea is that 2*pi creates a bunch of unnecessary 2's, and this obscures the geometrical intuition that you are making a "turn around the circle." For example, C = 2*pi*r is inelegant, whereas the circumference equation is better written as C = tau*r. Similarly, writing Euler's equation as e^(i*tau) = 1 has a simple geometrical meaning: going all the way around the unit circle brings you back to the same point you started at. By comparison, the geometrical meaning of e^(i*pi) = -1 is more obscure. I think this is a pretty good set of criticisms.
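The first point can be made concrete with a little code. Here is a minimal sketch (the function name `prime_factors` is my own, not from the comment): trial-division factorization gives each number a unique list of prime factors, and if 1 counted as prime, that uniqueness would break, which is exactly the kind of exception the theorems would have to carve out.

```python
def prime_factors(n):
    """Return the prime factorization of n as a sorted list (trial division)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains is a prime factor
        factors.append(n)
    return factors

print(prime_factors(12))  # [2, 2, 3] - the unique factorization
# If 1 were prime, [1, 2, 2, 3], [1, 1, 2, 2, 3], ... would all be valid
# factorizations of 12 too, so the fundamental theorem of arithmetic
# ("every integer > 1 has a unique prime factorization") would need an
# exception excluding 1 anyway.
```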
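The second point is easy to check numerically. This sketch evaluates both forms of Euler's equation with the standard library (results are only approximately 1 and -1 because of floating-point rounding):

```python
import cmath
import math

tau = 2 * math.pi  # the proposed constant: full circumference of the unit circle

# A full turn around the unit circle returns to the starting point, 1.
full_turn = cmath.exp(1j * tau)

# Half a turn lands on the opposite side of the circle, -1.
half_turn = cmath.exp(1j * math.pi)

print(abs(full_turn - 1) < 1e-12)   # True: e^(i*tau) ~ 1
print(abs(half_turn + 1) < 1e-12)   # True: e^(i*pi) ~ -1
```

The geometric reading is immediate in the tau form: the exponent i*tau literally means "one full turn."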