We’re studying polynomial equations modulo integers

$$ \begin{align*} f(x) \equiv 0 \pmod{m} \end{align*} $$

Recall that to solve this, we follow these steps:

  1. Reduce from a general modulus \(m\) to prime powers \(m = p^n\) using the Chinese Remainder Theorem. (We did this in Lecture 13: Chinese Remainder Theorem.)
  2. Reduce from a prime power \(p^n\) to a prime \(p\)
  3. Solve the case when \(m\) is a prime \(p\)

In this lecture, we will study several methods to do step (2). These methods include

  1. The stupid method
  2. The not-quite-so-stupid method
  3. Hensel's Lemma and Newton's Method

Suppose we want to solve

$$ \begin{align*} x^2 \equiv 7 \pmod{3^{100}} \end{align*} $$

Using the stupid method (aka brute force), we will try all possible values of \(x\) so

$$ \begin{align*} x = 0, 1, \ldots, 3^{100} - 1 \end{align*} $$

This takes about \(O(p^n)\) steps, where \(p^n = 3^{100}\) here, which is hopelessly slow.
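A quick sketch of the stupid method in Python (the function name is just for illustration); since it is already hopeless for \(3^{100}\), we demo it on \(3^3\):

```python
# Brute force: try every residue. Only feasible for tiny moduli;
# for a modulus like 3**100 this loop would never terminate.
def brute_force_roots(a, modulus):
    """All x with x^2 ≡ a (mod modulus)."""
    return [x for x in range(modulus) if (x * x - a) % modulus == 0]

print(brute_force_roots(7, 3**3))  # → [13, 14]
```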


The Less Stupid Method

Let’s try the second method (the less stupid one). Instead of taking the equation modulo \(3^{100}\) directly, we start with the lowest power and work our way up. So

$$ \begin{align*} x^2 &\equiv 7 \pmod{3^1}, \\ x^2 &\equiv 7 \pmod{3^2}, \cdots \end{align*} $$

In particular, let’s start with the lowest power possible,

$$ \begin{align*} x^2 \equiv 7 \pmod{3} \end{align*} $$

We can solve this by trial and error since the modulus is small. The solutions are \(x \equiv \pm 1 \pmod{3}\). Let’s focus on one of them, say \(x \equiv 1 \pmod{3}\). Next, we want to solve

$$ \begin{align*} x^2 \equiv 7 \pmod{3^2} \end{align*} $$

We will “lift” the solution modulo \(3\) to a solution modulo \(9\). We know that \(x \equiv 1 \pmod{3}\) solves the first congruence, which means \(x = 3k + 1\). So

$$ \begin{align*} x \equiv 1, 4, 7 \pmod{9} \end{align*} $$

Let’s square each potential value of \(x\)

$$ \begin{align*} x^2 &= 1^2 \equiv 1 \pmod{9} \\ x^2 &= 4^2 \equiv 16 \equiv 7 \pmod{9} \\ x^2 &= 7^2 \equiv 49 \equiv 4 \pmod{9} \end{align*} $$

We can see that \(x \equiv 4\) works since \(4^2 \equiv 7 \pmod{3^2}\), and it is the only one of the three lifts that works. We now repeat the same process to solve

$$ \begin{align*} x^2 \equiv 7 \pmod{3^3} \end{align*} $$

We know that \(x \equiv 4 \pmod{9}\) solves the second congruence. We lift this solution to a solution modulo \(27\): writing \(x = 9k + 4\), we can try the values

$$ \begin{align*} x \equiv 4, 13, 22 \pmod{27} \end{align*} $$

Let’s square each potential value

$$ \begin{align*} x^2 &= 4^2 \equiv 16 \pmod{27} \\ x^2 &= 13^2 \equiv 169 \equiv 7 \pmod{27} \\ x^2 &= (22)^2 \equiv 484 \equiv 25 \pmod{27} \end{align*} $$

So the solution is \(x \equiv 13 \pmod{27}\). We can keep going like this until we reach the solution to

$$ \begin{align*} x^2 \equiv 7 \pmod{3^{100}} \end{align*} $$

How long does this algorithm take? At each of the \(n\) levels we try \(p\) candidate lifts, so it takes \(O(pn)\) steps, much faster than \(O(p^n)\).
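The lifting-by-trial procedure above can be sketched in Python (illustrative names, not a standard API):

```python
def lift_by_trial(a, p, n):
    """Solve x^2 ≡ a (mod p^n) by lifting solutions one power at a
    time, trying all p candidate lifts of each solution: O(p*n) work
    instead of the brute force's O(p^n)."""
    sols = [x for x in range(p) if (x * x - a) % p == 0]
    mod = p
    for _ in range(1, n):
        new_mod = mod * p
        # each solution s mod p^k has p candidate lifts s + j*p^k
        sols = [s + j * mod
                for s in sols
                for j in range(p)
                if ((s + j * mod) ** 2 - a) % new_mod == 0]
        mod = new_mod
    return sols

print(lift_by_trial(7, 3, 3))  # → [13, 14]
```

Running `lift_by_trial(7, 3, 100)` solves the original problem modulo \(3^{100}\) in a fraction of a second.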


Towards Hensel's Lemma

Notice how we started with a solution modulo \(p^k\) and then lifted it to a solution modulo \(p^{k+1}\).

The question now is: can we always lift \(x_i\) to \(x_{i+1}\), and is the lift unique?

In the previous example, we saw that we could lift every time and the lift was always unique. However, this is not always the case. Consider

$$ \begin{align*} x^2 \equiv 17 \pmod{2^{10}} \end{align*} $$

We start with the lowest power:

$$ \begin{align*} x^2 &\equiv 17 \pmod{2^{1}} \\ x^2 &\equiv 1 \pmod{2} \end{align*} $$

Here the solution is \(x \equiv 1 \pmod{2}\) since \(-1 \equiv 1 \pmod{2}\). Next, we take the next power:

$$ \begin{align*} x^2 &\equiv 17 \pmod{2^{2}} \\ x^2 &\equiv 1 \pmod{4} \end{align*} $$

We have two solutions: \(x \equiv 1 \pmod{4}\) and \(x \equiv 3 \pmod{4}\). Next, we want to solve

$$ \begin{align*} x^2 &\equiv 17 \pmod{2^{3}} \\ x^2 &\equiv 1 \pmod{8} \end{align*} $$

The solutions are \(x \equiv 1, 3, 5, 7 \pmod{8}\), since the square of any odd number is \(\equiv 1 \pmod{8}\). But suppose we lift the solutions modulo \(4\) to solutions modulo \(8\). Starting with \(x \equiv 1 \pmod{4}\), we have \(x = 4k + 1\), so we can try

$$ \begin{align*} x^2 &= 1^2 \equiv 1 \pmod{8} \\ x^2 &= 5^2 \equiv 25 \equiv 1 \pmod{8} \end{align*} $$

We get two different lifts here, i.e., the lift is not unique. Similarly, if \(x \equiv 3 \pmod{4}\), then \(x = 4k + 3\). So

$$ \begin{align*} x^2 &= 3^2 \equiv 1 \pmod{8} \\ x^2 &= 7^2 \equiv 49 \equiv 1 \pmod{8} \end{align*} $$

Again, the lift is not unique: it is either \(3\) or \(7\). So the algorithm has to carry every solution to the next level, which takes even more time. That settles uniqueness. Are we guaranteed a lift every time? Suppose we continue to the next level:

$$ \begin{align*} x^2 &\equiv 17 \pmod{2^{4}} \\ x^2 &\equiv 1 \pmod{16} \end{align*} $$

Let’s try to lift the solution \(x \equiv 3 \pmod{8}\) that we got earlier. This means \(x = 8k + 3\), so we try \(x = 3\) and \(x = 11\). But

$$ \begin{align*} x^2 &= 3^2 \equiv 9 \pmod{16} \\ x^2 &= 11^2 \equiv 121 \equiv 9 \pmod{16} \end{align*} $$

So neither lift works! The branch starting at \(x \equiv 3 \pmod{8}\) dies, even though other branches (such as \(x \equiv 1 \pmod{8}\)) do lift to solutions modulo \(16\). So now we have a tree that looks like this

[TODO]

Is there a way to know early on whether we can lift a solution, before we waste our time going all the way to, say, modulo \(2^{99}\) and discover that we have to stop? Yes: Hensel’s Lemma.
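A short brute-force enumeration makes the branching visible level by level:

```python
# Enumerate solutions of x^2 ≡ 17 (mod 2^k) for small k to watch the
# lifting tree branch and prune: counts change from level to level,
# and individual branches die even when the congruence stays solvable.
for k in range(1, 7):
    mod = 2 ** k
    sols = [x for x in range(mod) if (x * x - 17) % mod == 0]
    print(k, sols)
```

At \(k = 3\) there are four solutions, and at \(k = 4\) there are still four, but \(3\) and \(11\) are not among them: the branch through \(3 \pmod 8\) has died.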


Hensel's Lemma

Suppose we have solved the congruence:

$$ \begin{align*} f(x_1) \equiv 0 \pmod{p} \end{align*} $$

So \(x_1\) is a solution modulo \(p\). Now, we want to lift this solution to a solution modulo \(p^2\). That is, we are looking for a solution of the form

$$ \begin{align*} x = x_1 + ap \end{align*} $$

Such that

$$ \begin{align*} f(x_1 + ap) \equiv 0 \pmod{p^2} \end{align*} $$

Here \(a\) is the unknown integer we are trying to find. Let’s apply Taylor’s Theorem to see that

$$ \begin{align*} f(x_1 + ap) \equiv f(x_1) + apf'(x_1) + \frac{(ap)^2}{2!} f''(x_1) + \cdots \pmod{p^2} \end{align*} $$

But we’re working modulo \(p^2\), so all the terms after the linear term vanish modulo \(p^2\). The justification: for \(f \in \mathbb{Z}[x]\), each coefficient \(\frac{f^{(k)}(x_1)}{k!}\) is an integer (the factorial in the denominator cancels against the falling factorial produced by differentiating each power of \(x\) \(k\) times), and for \(k \geq 2\) it is multiplied by \((ap)^k\), which is divisible by \(p^2\). So now we have

$$ \begin{align*} f(x_1 + ap) \equiv f(x_1) + apf'(x_1) \pmod{p^2} \end{align*} $$

We also know that \(f(x_1) \equiv 0 \pmod{p}\), so \(f(x_1) = pk\) for some integer \(k\), and we can write

$$ \begin{align*} f(x_1 + ap) \equiv pk + apf'(x_1) \pmod{p^2} \end{align*} $$

Setting \(f(x_1 + ap) \equiv 0 \pmod{p^2}\) means \(pk + apf'(x_1) \equiv 0 \pmod{p^2}\), and dividing through by \(p\) gives

$$ \begin{align*} k + af'(x_1) \equiv 0 \pmod{p} \end{align*} $$

Solving for \(a\), we get

$$ \begin{align*} a \equiv -k \cdot f'(x_1)^{-1} \pmod{p} \end{align*} $$

This is only solvable if \(f'(x_1) \not\equiv 0 \pmod{p}\) (so the inverse exists). In particular, Hensel’s Lemma says that

Let \( f(x) \in \mathbb{Z}[x] \), and let \( p \) be a prime. We can solve \[ f(x) \equiv 0 \pmod{p^n} \] if we can find a solution \( x_0 \) modulo \( p \) such that \[ f(x_0) \equiv 0 \pmod{p} \quad \text{and} \quad f'(x_0) \not\equiv 0 \pmod{p}. \] This solution can be lifted inductively: given a solution \( x_k \pmod{p^k} \), there exists a unique \( x_{k+1} \equiv x_k \pmod{p^k} \) such that \[ f(x_{k+1}) \equiv 0 \pmod{p^{k+1}}. \]

What does this mean? If we were to solve

$$ \begin{align*} x^2 &\equiv 7 \pmod{3^n} \end{align*} $$

Then, we start with \(n = 1\). We check if there is a solution to

$$ \begin{align*} x^2 \equiv 7 \equiv 1 \pmod{3} \end{align*} $$

We do in fact have the solutions \(x \equiv \pm 1 \pmod{3}\).
Now, pick \(x_0 = 1\) and define \(f(x) = x^2 - 7\). Then

$$ \begin{align*} f(1) &= 1^2 - 7 = -6 \equiv 0 \pmod{3} \\ f'(1) &= 2(1) = 2 \not\equiv 0 \pmod{3} \end{align*} $$

Since \(f(x_0) \equiv 0 \pmod{3}\) and \(f'(x_0) \not\equiv 0 \pmod{3}\), Hensel’s Lemma applies and guarantees that the solution \(x_0 \equiv 1 \pmod{3}\) can be lifted to a solution modulo \(3^n\) for any \(n \geq 1\).
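We can also check the lifting formula \(a \equiv -k \cdot f'(x_1)^{-1} \pmod{p}\) numerically for this example (a quick sketch using Python’s modular inverse `pow(b, -1, m)`):

```python
# Verify one lifting step for f(x) = x^2 - 7, p = 3, x1 = 1:
# f(x1) = -6 = 3*(-2), so k = -2 and a ≡ -k * f'(x1)^(-1) (mod 3).
p, x1 = 3, 1
f = lambda x: x * x - 7
df = lambda x: 2 * x
k = f(x1) // p                       # k = -2
a = (-k * pow(df(x1), -1, p)) % p    # inverse of f'(x1) modulo p
x2 = x1 + a * p                      # the lifted solution modulo p^2
print(x2, (x2 * x2 - 7) % p**2)      # → 4 0
```

This recovers the solution \(x \equiv 4 \pmod 9\) found earlier by trial.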


Observe now that when we tried solving

$$ \begin{align*} x^2 &\equiv 1 \pmod{16} \end{align*} $$

We attempted to lift the solution \(x \equiv 3 \pmod{8}\) to a solution modulo \(16\), and saw that neither candidate worked, since neither was congruent to \(1\) modulo \(16\):

$$ \begin{align*} x^2 &= 3^2 \equiv 9 \pmod{16} \\ x^2 &= 11^2 \equiv 121 \equiv 9 \pmod{16} \end{align*} $$

Now consider \(f(x) = x^2 - 1\). Then

$$ \begin{align*} f'(x) = 2x \equiv 0 \pmod{2} \end{align*} $$

for every \(x\), so the hypothesis \(f'(x_0) \not\equiv 0 \pmod{p}\) fails (here \(p = 2\)). Hensel’s Lemma does not apply, and indeed we are not guaranteed that the solution \(x \equiv 3 \pmod{8}\) lifts from modulo \(8\) to modulo \(16\).

Hensel’s Lemma takes around \(O(n)\) steps, much less than \(O(pn)\): at each level we compute the unique lift directly instead of trying all \(p\) candidates.
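Putting the pieces together, one-step-at-a-time Hensel lifting might look like this (a sketch; `hensel_lift` is my own name for it):

```python
def hensel_lift(f, df, x0, p, n):
    """Lift a root x0 of f modulo p (with f'(x0) not ≡ 0 mod p) to a
    root modulo p^n, one power of p at a time: O(n) lifting steps."""
    inv = pow(df(x0) % p, -1, p)    # f'(x0)^(-1) mod p; exists by assumption
    x, mod = x0, p
    for _ in range(1, n):
        mod *= p
        x = (x - f(x) * inv) % mod  # the unique lift to the next level
    return x

f = lambda x: x * x - 7
root = hensel_lift(f, lambda x: 2 * x, 1, 3, 100)
print((root * root - 7) % 3**100)  # → 0
```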


Newton's Method

You have seen Newton’s Method before in Numerical Analysis or Calculus. It is an iterative algorithm for finding roots of real-valued functions.

$$ \begin{align*} f(x) = 0 \end{align*} $$

We start with a guess \(x_0\) and then improve it using the tangent line at that point:

$$ \begin{align*} x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} \end{align*} $$

We can analyze this new guess using Taylor’s Theorem:

$$ \begin{align*} f\left(x - \frac{f(x)}{f'(x)}\right) = f(x) + f'(x) \cdot \left(- \frac{f(x)}{f'(x)}\right) + \frac{f''(x)}{2!} \cdot \left(- \frac{f(x)}{f'(x)}\right)^2 + \cdots \end{align*} $$

We first note that the first two terms cancel with each other. The remaining terms are

$$ \begin{align*} f\left(x - \frac{f(x)}{f'(x)}\right) = \sum_{k=2}^{\infty} \frac{f^{(k)}(x)}{k!} \cdot \left(- \frac{f(x)}{f'(x)}\right)^k \end{align*} $$
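Over the reals this iteration converges very quickly; here it is computing \(\sqrt{7}\), echoing the running example (a quick sketch):

```python
# Newton's method for f(x) = x^2 - 7 over the reals: each iteration
# roughly doubles the number of correct digits.
f = lambda x: x * x - 7
df = lambda x: 2 * x
x = 3.0                    # initial guess
for _ in range(6):
    x = x - f(x) / df(x)
print(x)                   # ≈ 2.6457513... = sqrt(7)
```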

So this is Newton’s Method. Now suppose that we know \(x\) is a solution modulo \(p^n\), i.e.,

$$ \begin{align*} f(x) \equiv 0 \pmod{p^n} \end{align*} $$

Then

$$ \begin{align*} f(x) = p^nk \quad \text{for some integer }k \end{align*} $$

and further suppose that \(f'(x) \not\equiv 0 \pmod{p}\) (so that dividing by \(f'(x)\) means multiplying by its inverse).
Now apply Newton’s update above and consider the quadratic term:

$$ \begin{align*} \frac{f''(x)}{2!} \cdot \left(- \frac{f(x)}{f'(x)}\right)^2 \end{align*} $$

The factor \(\frac{f''(x)}{2!}\) is an integer, as before. The squared factor can be written as

$$ \begin{align*} \left(- \frac{f(x)}{f'(x)}\right)^2 = \frac{(p^nk)^2}{f'(x)^2} = \frac{p^{2n}k^2}{f'(x)^2} \end{align*} $$

So it is divisible by \(p^{2n}\), since \(f'(x) \not\equiv 0 \pmod{p}\) by assumption. The same argument shows that every term with \(k \geq 2\) is divisible by at least \(p^{2n}\), so they all vanish modulo \(p^{2n}\). In general

$$ \begin{align*} f\left(x - \frac{f(x)}{f'(x)}\right) \equiv 0 \pmod{p^{2n}} \end{align*} $$

provided \(f'(x) \not\equiv 0 \pmod{p}\). So the conclusion is the same as Hensel’s: if \(f(x) \equiv 0 \pmod{p^n}\) and \(f'(x) \not\equiv 0 \pmod{p}\), then the Newton update

$$ \begin{align*} x - \frac{f(x)}{f'(x)} \end{align*} $$

is a solution modulo \(p^{2n}\). Note that unlike Hensel’s Lemma, we double the exponent instead of adding \(1\), so only \(O(\log n)\) lifting steps are needed to reach \(p^n\).
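A sketch of the doubling version (same assumptions as before; `newton_lift` is an illustrative name):

```python
def newton_lift(f, df, x0, p, n):
    """Lift a root of f modulo p (with f'(x0) not ≡ 0 mod p) to a root
    modulo p^n, doubling the exponent at each step: O(log n) lifts."""
    x, exp = x0, 1
    while exp < n:
        exp = min(2 * exp, n)
        mod = p ** exp
        # "division" by f'(x) means multiplying by its inverse mod p^exp
        x = (x - f(x) * pow(df(x), -1, mod)) % mod
    return x

f = lambda x: x * x - 7
root = newton_lift(f, lambda x: 2 * x, 1, 3, 100)
print((root * root - 7) % 3**100)  # → 0
```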


Example

TODO


References