
Ask Professor Puzzler



"I have two questions. First, why are irrational numbers called irrational? Doesn't that mean "not logical?" Second, how do we know that the square root of 2 is irrational?" 11th grader Howard

Well, Howard, you're right that "irrational" means "not logical," but that's not the only meaning the word can have. Instead of breaking the word down as ir-rational, break it down this way: ir-ratio-nal. Instead of meaning "not rational," the word "irrational" means "not a ratio." And this is actually the definition of an irrational number; it is a real number which cannot be written as the ratio (or fraction) of two integers.

We often describe irrational numbers as numbers which have a non-repeating, non-terminating decimal, but that's just a description, not the definition of an irrational number; it's a consequence of the fact that it can't be written as a ratio of integers.

So when we say that the square root of two is irrational, we're saying that there are no two integers such that when you divide one by the other, you get the square root of two.

Can we prove that the square root of two is irrational? Yes, we can. I'll provide you with an informal proof in the following paragraphs.

We'll begin by assuming that the square root of two is not irrational. In other words, that it can be written as a ratio of two integers. If we make this assumption, and then obtain a contradiction, then we'll have proven that our assumption was wrong. In other words, we'll have proved that the square root of two is irrational.

Since we're assuming that the square root of two is rational, that's the same as assuming that there exists a fraction a/b in simplest form such that a/b = √2. Note that simplest form implies that a and b are relatively prime integers. If we square both sides of this equation, we obtain the following:

a²/b² = 2, or a² = 2b².

Since b (and therefore b²) is an integer, a² is equal to twice an integer, which means a² is even. And since the square of an odd number is always odd, the only way a² can be even is if a itself is even; in other words, a must be a multiple of 2.
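
In case that "square of an odd number is odd" step isn't obvious, here's a quick way to check it. Any odd number can be written as 2k + 1, where the letter k is just shorthand for "some integer" (the letter isn't part of the proof itself; it's only there so we can do the algebra):

(2k + 1)² = 4k² + 4k + 1 = 2(2k² + 2k) + 1,

which is one more than an even number, and therefore odd. So an even square, like our a², can only come from an even number.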

Since a² is the square of a multiple of 2, it is a multiple of 4. (As a side note, you might find this interesting: all perfect squares are either a multiple of 4, or one more than a multiple of 4.)
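
Both halves of that side note come from the same kind of algebra. Every integer can be written either as 2k (if it's even) or as 2k + 1 (if it's odd), with k again standing for some integer:

(2k)² = 4k², which is a multiple of 4;
(2k + 1)² = 4k² + 4k + 1 = 4(k² + k) + 1, which is one more than a multiple of 4.

In our proof, a is even, so a = 2k for some integer k, and therefore a² = 4k² really is a multiple of 4.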

But since a² = 2b², and we've just seen that a² is a multiple of 4, it follows that 2b² is a multiple of 4, so b² must be a multiple of 2. And by the same reasoning we used above for a, we can conclude that b must also be a multiple of 2.
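
If you'd like to see that last step with the algebra written out (still writing a = 2k), substitute into the equation from before:

a² = 2b² becomes 4k² = 2b², so b² = 2k².

That makes b² equal to twice an integer, so b² is even, and by the odd-squares-are-odd argument from earlier, b must be even as well.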

Aha! Here's the contradiction we needed; we started with the assumption that a and b were relatively prime, and have ended up proving that both a and b are multiples of 2, which means they are not relatively prime.

Since we've created a contradiction, we've proved our original assumption wrong, from which we can conclude that the square root of two must be irrational.
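
That completes the proof, but if you'd like a feel for what it's saying numerically, try squaring a few fractions that are close to the square root of two:

(3/2)² = 9/4 = 2.25
(7/5)² = 49/25 = 1.96
(17/12)² = 289/144 ≈ 2.0069
(99/70)² = 9801/4900 ≈ 2.0002

The fractions get closer and closer to the square root of two, but the proof guarantees that no fraction, no matter how complicated, will ever square to exactly 2.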
