
I'm trying to calculate this expression $e^{i2\pi ab}$ where $a$ is an integer and $b$ is a real number.

One of the laws of exponents states that $a^{bc}=\left(a^b\right)^c$.

Applying that law gives us $e^{i2\pi ab}=\left(e^{i2\pi a}\right)^{b}$. Furthermore, we know that $e^{i2\pi n}=1$ for all $n \in \mathbb Z$.

Which means that $\left(e^{i2\pi a}\right)^{b}=1^b=1$. However, this is wrong for general $b\in \mathbb {R}$. In which of the steps have I made an error?

Krippkrupp

1 Answer


That "law of exponents" you cite --- where'd it come from? Algebra 1, where you learned it worked for integer values of $b$ and $c$? Calculus, where you learned it worked for real values of $b$ and $c$ (and maybe learned that they had to be positive for it to actually make sense)?

Have you ever seen it stated or proved for complex numbers? Probably not, because it's not (generally) true once the base or the exponents are allowed to be complex. Your question provides a nice example of the kind of failure that can occur in the complex case.
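To see the failure concretely in your own setup (picking particular values of $a$ and $b$ just for illustration), take $a=1$ and $b=\tfrac12$:
$$e^{i2\pi\cdot 1\cdot\frac12}=e^{i\pi}=-1,\qquad\text{while}\qquad\left(e^{i2\pi\cdot 1}\right)^{\frac12}=1^{\frac12}=1,$$
so the two sides of the "law" genuinely disagree.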

So that's where your argument goes wrong: you've overgeneralized something you learned earlier, without confirming that the generalization was valid.
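For what it's worth, one special case does survive: for any complex $z$ and any integer $n$,
$$\left(e^{z}\right)^{n}=e^{zn},$$
so your derivation would have been fine if $b$ were an integer. It's precisely the non-integer real $b$ that breaks it.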

John Hughes