We’re studying equations modulo an integer \(m\).
Recall that to solve such an equation, we follow these steps:
- Reduce \(m\) to a prime power, \(m = p^n\), using the Chinese Remainder Theorem. (We did this in Lecture 13: Chinese Remainder Theorem)
- Reduce from a prime power \(p^n\) to a prime \(p\)
- Do the case when \(m\) is prime
In this lecture, we will study several methods to do step (2). These methods include
- The stupid method
- The less stupid method
- Hensel's Lemma and Newton's Method
Suppose we want to solve \(x^2 \equiv 7 \pmod{3^{100}}\).
Using the stupid method (aka brute force), we try every possible value of \(x\), that is \(x = 0, 1, 2, \dots, 3^{100} - 1\), and check each one.
This will take about \(O(p^n)\) steps, where \(p^n\) here is \(3^{100}\).
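To make the cost concrete, here is a minimal sketch of the brute-force method in Python (the function name `brute_force_sqrt` is just illustrative): it simply tries every residue. It finishes instantly for a small modulus like \(3^5\) but is hopeless for \(3^{100}\).

```python
def brute_force_sqrt(a, modulus):
    """Return some x with x*x ≡ a (mod modulus) by trying every residue, or None."""
    for x in range(modulus):
        if (x * x - a) % modulus == 0:
            return x
    return None

# Fine for 3^5 = 243, but there are 3^100 candidates in the actual problem.
print(brute_force_sqrt(7, 3 ** 5))
```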
The Less Stupid Method
Let’s try the second method (the less stupid one). Instead of taking the equation modulo \(3^{100}\) directly, we’ll start with the lowest power and work our way up.
In particular, let’s start with the lowest power possible, \(x^2 \equiv 7 \pmod{3}\).
We can solve this by trial and error since the modulus is small. The solution is \(x \equiv \pm 1 \pmod{3}\). But let’s focus on one of them, say \(x \equiv 1 \pmod{3}\). Next, we want to solve \(x^2 \equiv 7 \pmod{3^2}\).
We will “lift” the solution modulo \(3\) to a solution modulo \(9\). We know that \(x \equiv 1 \pmod{3}\) is a solution to the first equation. This means that \(x = 3k + 1\), so the candidates modulo \(9\) are \(x = 1, 4, 7\).
Let’s square each potential value of \(x\): \(1^2 \equiv 1\), \(4^2 \equiv 7\), and \(7^2 \equiv 4 \pmod{9}\).
We can see here that \(x = 4\) works since \(4^2 \equiv 7 \pmod{3^2}\), and it is the only one of these candidates that does. We now repeat the same process to solve \(x^2 \equiv 7 \pmod{3^3}\).
We know that \(x \equiv 4 \pmod{9}\) is a solution to the second congruence. We will lift this solution to a solution modulo \(27\). We know \(x = 9k + 4\), so we can try the values \(x = 4, 13, 22\).
Let’s square each potential value: \(4^2 \equiv 16\), \(13^2 \equiv 7\), and \(22^2 \equiv 25 \pmod{27}\).
So the solution is \(x \equiv 13 \pmod{27}\). We can keep going like this until we reach a solution to \(x^2 \equiv 7 \pmod{3^{100}}\).
How long does this algorithm take? There are \(n\) levels and we try \(p\) candidates at each level, so it takes about \(O(pn)\) steps, much faster than \(O(p^n)\).
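Here is a small Python sketch of this level-by-level lifting (not from the lecture; `lift_all` is an illustrative name). At each level it keeps every solution found so far and tries the \(p\) candidates above each one, which is where the \(O(pn)\) count comes from.

```python
def lift_all(a, p, n):
    """All solutions of x^2 ≡ a (mod p^n), found by lifting one power of p at a time."""
    sols = [x for x in range(p) if (x * x - a) % p == 0]   # solve mod p by trial
    mod = p
    for _ in range(n - 1):
        new_mod = mod * p
        # each solution x mod `mod` has candidates x, x + mod, ..., x + (p-1)*mod
        sols = [x + k * mod for x in sols for k in range(p)
                if ((x + k * mod) ** 2 - a) % new_mod == 0]
        mod = new_mod
    return sols

print(lift_all(7, 3, 3))   # [13, 14], i.e. x ≡ ±13 (mod 27)
```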
Hensel's Lemma
Notice how we started with a solution modulo \(p^k\) and then lifted it to a solution modulo \(p^{k+1}\).
In the previous example, we saw that we could lift at every step and that the lift was always unique. However, this is not always true. Let’s do another example where it fails. Consider \(x^2 \equiv 1\) modulo a large power of \(2\).
We start with the lowest power: \(x^2 \equiv 1 \pmod{2}\).
Here the solution is \(x \equiv 1 \pmod{2}\) since \(-1 \equiv 1 \pmod{2}\). Next, we take the next power: \(x^2 \equiv 1 \pmod{4}\).
We have two solutions: \(x \equiv 1 \pmod{4}\) and \(x \equiv 3 \pmod{4}\). Next, we want to solve \(x^2 \equiv 1 \pmod{8}\).
Obviously the solutions are \(x \equiv 1, 3, 5, 7 \pmod{8}\). But suppose we are lifting the solutions modulo \(4\) to the solutions modulo \(8\). Suppose that we started with \(x \equiv 1 \pmod{4}\), then \(x = 4k + 1\). So we can try \(x = 1\) and \(x = 5\), and both work: \(1^2 \equiv 5^2 \equiv 1 \pmod{8}\).
We have two different solutions here (i.e., the lift is not unique). Similarly, if \(x \equiv 3 \pmod{4}\), then \(x = 4k + 3\). So we try \(x = 3\) and \(x = 7\), and again \(3^2 \equiv 7^2 \equiv 1 \pmod{8}\).
Again, the lift is not unique: it is either \(3\) or \(7\). So now the algorithm has to try each solution when it’s lifted to the next level, which takes even more time. That’s for uniqueness. Are we guaranteed a solution every time? Suppose we continue to the next level, \(x^2 \equiv 1 \pmod{16}\).
Let’s now try the lifted solution \(x \equiv 3 \pmod{8}\) that we got earlier. This means that \(x = 8k + 3\), so we want to try \(x = 3\) and \(x = 11\), but \(3^2 \equiv 9 \pmod{16}\) and \(11^2 \equiv 9 \pmod{16}\).
So neither candidate works! This branch gives no solutions modulo \(16\). So now we have a tree that looks like this
[TODO]
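As a quick sanity check (not part of the notes), we can list the solutions of \(x^2 \equiv 1\) at each level in Python and watch the branch at \(3\) die:

```python
for k in range(1, 5):
    mod = 2 ** k
    sols = [x for x in range(mod) if (x * x - 1) % mod == 0]
    print(f"solutions mod 2^{k}: {sols}")

# solutions mod 2^1: [1]
# solutions mod 2^2: [1, 3]
# solutions mod 2^3: [1, 3, 5, 7]
# solutions mod 2^4: [1, 7, 9, 15]   <- neither 3 nor 3 + 8 = 11 survives
```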
Is there a way to know early on whether we can lift a solution, before we waste our time going all the way to, say, modulo \(2^{99}\) and discover that we have to stop? Yes: Hensel’s Lemma.
Hensel's Lemma
Suppose we have solved the congruence \(f(x) \equiv 0 \pmod{p}\), where \(f\) is a polynomial with integer coefficients.
So \(x_1\) is a solution modulo \(p\). Now, we want to lift this solution to a solution modulo \(p^2\). That is, we are looking for a solution of the form \(x_2 = x_1 + ap\)
such that \(f(x_1 + ap) \equiv 0 \pmod{p^2}\).
Here \(a\) is the unknown integer we are trying to find. Let’s apply Taylor’s Theorem to see that
\[ f(x_1 + ap) = f(x_1) + f'(x_1)\,ap + \frac{f''(x_1)}{2!}(ap)^2 + \cdots \]
But we’re working modulo \(p^2\), so all the terms after the linear term will vanish modulo \(p^2\). (IMPORTANT: there is a justification for the later terms like \(\frac{f''(x_1)}{2!}(ap)^2\) being divisible by \(p^2\) and hence vanishing. I didn’t write notes on this; the tl;dr is that the factorial at the bottom cancels out with the derivative on top, so the whole fraction is actually an integer, and \((ap)^k\) is divisible by \(p^2\) for \(k \geq 2\).) So now we have
\[ f(x_1) + f'(x_1)\,ap \equiv 0 \pmod{p^2}. \]
We also already know that \(f(x_1) \equiv 0 \pmod{p}\), so \(f(x_1) = pk\) for some \(k\), and we can write
\[ pk + f'(x_1)\,ap \equiv 0 \pmod{p^2}. \]
Since we’re working modulo \(p^2\), we can divide through by \(p\) to get
\[ k + f'(x_1)\,a \equiv 0 \pmod{p}. \]
Solving for \(a\), we get
\[ a \equiv -k\,\bigl(f'(x_1)\bigr)^{-1} \pmod{p}. \]
This is only solvable if \(f'(x_1) \not\equiv 0 \pmod{p}\) (so the inverse exists). In particular, Hensel’s Lemma says that if \(f(x_1) \equiv 0 \pmod{p}\) and \(f'(x_1) \not\equiv 0 \pmod{p}\), then \(x_1\) lifts to a unique solution modulo \(p^2\), and, repeating the argument, to a unique solution modulo \(p^n\) for every \(n \geq 1\).
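Here is a sketch of one lifting step for the running example \(f(x) = x^2 - c\), implementing the formula \(a \equiv -k\,(f'(x_1))^{-1} \pmod{p}\) above; the same computation works at every level \(p^j \to p^{j+1}\). The names `hensel_step`, `c`, and `pj` are mine, not the lecture’s.

```python
def hensel_step(x1, c, p, pj):
    """Lift a solution x1 of x^2 ≡ c (mod pj) to a solution mod p*pj,
    assuming f'(x1) = 2*x1 is invertible mod p."""
    k = (x1 * x1 - c) // pj               # f(x1) = k * pj
    a = (-k * pow(2 * x1, -1, p)) % p     # a ≡ -k * f'(x1)^(-1)  (mod p)
    return x1 + a * pj

x, pj = 1, 3                              # start from x ≡ 1 (mod 3), x^2 ≡ 7 (mod 3)
for _ in range(4):
    x = hensel_step(x, 7, 3, pj)
    pj *= 3
    print(pj, x, (x * x - 7) % pj)        # the residue is 0 at every level
```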
What does this mean? If we were to solve \(x^2 \equiv 7 \pmod{3^n}\),
then we start with \(n = 1\). We check if there is a solution to \(x^2 \equiv 7 \pmod{3}\).
We do in fact have the solutions \(x \equiv \pm 1 \pmod{3}\).
Now, pick \(x_0 = 1\) and define \(f(x) = x^2 - 7\). Then \(f(x_0) = -6 \equiv 0 \pmod{3}\) and \(f'(x_0) = 2 \not\equiv 0 \pmod{3}\).
Since \(f(x_0) \equiv 0 \pmod{3}\) and \(f'(x_0) \not\equiv 0 \pmod{3}\), Hensel’s Lemma applies and guarantees that the solution \(x_0 \equiv 1 \pmod{3}\) can be lifted to a solution modulo \(3^n\) for any \(n \geq 1\).
Observe now that when we tried solving \(x^2 \equiv 1 \pmod{16}\),
we attempted to lift the solution \(x \equiv 3 \pmod{8}\) to a solution modulo \(16\). We saw that neither candidate worked, since neither was equivalent to \(1\) modulo \(16\): \(3^2 \equiv 11^2 \equiv 9 \pmod{16}\).
Now consider \(f(x) = x^2 - 1\). Then \(f'(x) = 2x\), so \(f'(3) = 6 \equiv 0 \pmod{2}\).
Here, Hensel’s Lemma can’t be applied, so we are not guaranteed that we can lift the solution \(x \equiv 3 \pmod{8}\) from modulo \(8\) to modulo \(16\).
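In terms of the lifting formula, the failure shows up as the missing inverse of \(f'(3) = 6\) modulo \(2\); a quick check (mine, not from the notes):

```python
print(6 % 2)          # 0, so f'(3) = 6 ≡ 0 (mod 2)
try:
    pow(6, -1, 2)     # the inverse the lifting formula would need
except ValueError as err:
    print("no inverse mod 2:", err)   # Hensel's Lemma does not apply
```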
Lifting with Hensel’s Lemma takes around \(O(n)\) steps, much less than the \(O(pn)\) of the less stupid method.
Newton's Method
You have seen Newton’s Method before in Numerical Analysis or Calculus. It is an iterative algorithm for finding roots of real-valued functions.
We start with a guess \(x_0\) and then improve it using the tangent line at that point:
\[ x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}. \]
We can analyze this new guess using Taylor’s Theorem:
\[ f(x_1) = f\!\left(x_0 - \frac{f(x_0)}{f'(x_0)}\right) = f(x_0) - f'(x_0)\,\frac{f(x_0)}{f'(x_0)} + \frac{f''(x_0)}{2!}\left(\frac{f(x_0)}{f'(x_0)}\right)^2 + \cdots \]
We first note that the first two terms cancel with each other. The remaining terms are
\[ \frac{f''(x_0)}{2!}\left(\frac{f(x_0)}{f'(x_0)}\right)^2 + \cdots \]
So this is Newton’s Method. Suppose that we know that \(x\) is a solution modulo \(p^n\), i.e., \(f(x) \equiv 0 \pmod{p^n}\).
Then \(f(x) = p^n k\) for some integer \(k\),
and further suppose that \(f'(x) \not\equiv 0 \pmod{p}\).
Now suppose that we apply Newton’s Method above and consider the second-order term,
\[ \frac{f''(x)}{2!}\left(\frac{f(x)}{f'(x)}\right)^2. \]
The first factor is an integer (by the same factorial-cancellation argument as before). The second factor can be written as
\[ \left(\frac{p^n k}{f'(x)}\right)^2 = \frac{p^{2n} k^2}{f'(x)^2}, \]
so it is divisible by \(p^{2n}\) if \(f'(x) \not\equiv 0 \pmod{p}\), which we have assumed. So all the remaining terms will vanish modulo \(p^{2n}\). In general, the term with the \(j\)-th derivative, \(\frac{f^{(j)}(x)}{j!}\left(\frac{f(x)}{f'(x)}\right)^j\), vanishes modulo \(p^{2n}\) for every \(j \geq 2\),
if \(f'(x) \not\equiv 0 \pmod{p}\). So the conclusion is the same. If \(f(x) \equiv 0 \pmod{p^n}\) and \(f'(x) \not\equiv 0 \pmod{p}\), then the Newton update
\[ x' \equiv x - f(x)\,\bigl(f'(x)\bigr)^{-1} \pmod{p^{2n}} \]
is a solution modulo \(p^{2n}\). Note that here, unlike Hensel, we’re doubling the exponent and not just adding \(1\).
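Here is a Python sketch of the exponent-doubling update for the running example \(f(x) = x^2 - c\) (the names are mine; it assumes \(f'(x) = 2x\) is invertible modulo \(p\), so its inverse also exists modulo \(p^{2n}\)):

```python
def newton_step(x, c, p, n):
    """Turn a solution of x^2 ≡ c (mod p^n) into one mod p^(2n), doubling the exponent."""
    mod = p ** (2 * n)
    inv = pow(2 * x, -1, mod)                 # f'(x)^(-1) mod p^(2n)
    return (x - (x * x - c) * inv) % mod

x, n = 1, 1                                   # x ≡ 1 (mod 3) solves x^2 ≡ 7 (mod 3)
for _ in range(3):
    x, n = newton_step(x, 7, 3, n), 2 * n
    print(f"x = {x} satisfies x^2 ≡ 7 (mod 3^{n})")
```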
Example
TODO