Evaluation Basis

\gdef\vec#1{\bm{#1}} \gdef\delim#1#2#3{\mathopen{}\mathclose{\left#1 #2 \right#3}} \gdef\p#1{\delim({#1})} \gdef\Mod#1{\delim[{#1}]} \gdef\F{\mathbb{F}} \gdef\G{\mathbb{G}} \gdef\g{\mathrm{g}} \gdef\NTT{\mathsf{NTT}} \gdef\MSM{\mathsf{MSM}} \gdef\Mul{\mathsf{Mul}} \gdef\Mulc{\mathsf{Mul}^{\mathsf{C}}} \gdef\Add{\mathsf{Add}} \gdef\D#1{\mathrm{\partial}_{#1}} \gdef\Z{\mathrm{Z}} \gdef\T{\mathsf{T}} \gdef\I{\mathrm{I}} \gdef\J{\mathrm{J}} \gdef\ω{\mathrm{ω}} \gdef\diag{\operatorname{diag}} \gdef\set#1{\delim\{{#1}\}}

Given a finite field \F_q we can construct the polynomial ring \F_q[X].

Fermat's little theorem: X^q - X has as roots all elements of \F_q.

https://en.wikipedia.org/wiki/Frobenius_endomorphism

If \F is a prime field of order p then \Z_{\F}(X) = X^p - X has as roots all elements of \F.

https://en.wikipedia.org/wiki/Fermat%27s_little_theorem

\Z_{\F}'(X) = p⋅X^{p-1} - 1 = -1

\delim\vert{\F_q[X]_{<n}}\vert = q^n

\F_q[X]_{<n} \sim \F_q^n

\delim\vert{\F_q → \F_q}\vert = q^q

Basis for \F_q[X]_{<q}.

E_a(X) = \frac{\Z_{\F \setminus \set{a}}(X)}{\Z_{\F}'(a)} = -\frac{X^p - X}{X - a}

Idea. Since X^{q-1} is 0 for X=0 and 1 otherwise we have E_0(X) = 1 - X^{q-1}. Furthermore we have E_a(X) = E_0(X-a). This gives

\begin{aligned} 1 - \p{X-a}^{q-1} &= -\frac{X^q - X}{X - a} \\ \p{X-a} - \p{X-a}^q &= -\p{X^q - X} \\ X-a - \p{X-a}^q &= -X^q + X \\ \p{X-a}^q &= X^q - a \\ \end{aligned}

This is a basic consequence of the Frobenius endomorphism and Fermat's little theorem.

This creates a map from functions over \F to \F_q[X]_{<q}. Given f(x) the equivalent P(X) is

P(X) = \sum_{a∈\F} f(a) ⋅ E_a\p{X}
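As a sanity check, here is a minimal Python sketch (not from the original notes) that builds P(X) = \sum_a f(a)⋅E_a(X) over the small prime field \F_7 and verifies that it reproduces f at every point. The modulus p = 7 and the test function f are arbitrary choices.

```python
# Represent an arbitrary function f: F_p -> F_p as the polynomial
# P(X) = sum_a f(a) * E_a(X), working over F_p with p = 7.
# Polynomials are coefficient lists [c_0, c_1, ...] with arithmetic mod p.

p = 7

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

def poly_add(a, b):
    n = max(len(a), len(b))
    return [((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % p
            for i in range(n)]

def poly_eval(c, x):
    acc = 0
    for ci in reversed(c):          # Horner's rule
        acc = (acc * x + ci) % p
    return acc

def E(a):
    # E_a(X) = prod_{b != a} (X - b) / Z_F'(a); for a prime field Z_F'(a) = -1,
    # so E_a is just minus the product.
    num = [1]
    for b in range(p):
        if b != a:
            num = poly_mul(num, [(-b) % p, 1])
    return [(-c) % p for c in num]

f = lambda x: (x * x + 3) % p       # any function F_p -> F_p
P = [0]
for a in range(p):
    P = poly_add(P, [f(a) * c % p for c in E(a)])

assert all(poly_eval(P, x) == f(x) for x in range(p))
```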

\Z_{\F}(X) = X^p - X

So we cannot distinguish between \F_q[X] and \F_q[X]/\p{X^q-X} by evaluations. But what about the differential operator \D{X}? Can it somehow detect the X^q and higher terms?

\D{X} X^q = q⋅X^{q-1}

and since q = 0 in \F_q this is also zero. So the differential operator does not distinguish them either.

Question. How does this work in extension fields? There we already have p = 0, not q = p^n.

A derivative operator is a linear map that satisfies \D{X}\p{a⋅b} = \D{X}\p{a}⋅b + a⋅\D{X}\p{b}.

Question. Given that every function can be represented by a polynomial, what does the multiplicative-inversion operator look like?

Q(X) = \frac{1}{P(X)}

We can start simply with \frac 1X:

\begin{aligned} P(X) &= \sum_{a∈\F} f(a) ⋅ E_a\p{X} \\ &= - \sum_{a∈\F \setminus \set{0}} \frac 1a ⋅\frac{X^p - X}{X - a} \\ \end{aligned}

Question. Or the compositional-inversion operator?

Q(X) = P^{-1}(X)

Monomial basis

Given a field \F and the polynomial ring \F[X] over it, an n-term polynomial P∈\F[X] can be written in monomial form

P(X) = c_0 + c_1 ⋅ X + c_2 ⋅ X^2 + ⋯ + c_{n-1} ⋅ X^{n-1}

where \vec c ∈ \F^n is the vector of coefficients. This is a basis in the linear algebra sense as we can construct a vector \vec m ∈ \F[X]^n such that P is the inner product \vec c ⋅ \vec m. Such a \vec m is called a standard basis.

P(X) = \begin{bmatrix} c_0 \\ c_1 \\ c_2 \\ ⋮ \\ c_{n-1} \end{bmatrix} ⋅ \begin{bmatrix} 1 \\ X \\ X^2 \\ ⋮ \\ X^{n-1} \end{bmatrix}
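A tiny Python sketch of the same statement (the example polynomial is an arbitrary choice): the coefficient vector dotted with the monomial basis vector (1, X, X², …), with Horner's rule as the usual way to evaluate it.

```python
# P(X) = 5 + 3 X^2 + 2 X^3 as a coefficient vector in the monomial basis.
c = [5, 0, 3, 2]

def eval_inner_product(c, x):
    # inner product of c with the basis vector (1, x, x^2, ...)
    return sum(ci * x**i for i, ci in enumerate(c))

def eval_horner(c, x):
    acc = 0
    for ci in reversed(c):
        acc = acc * x + ci
    return acc

assert eval_inner_product(c, 7) == eval_horner(c, 7) == 5 + 3 * 49 + 2 * 343
```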

Zero polynomial

Polynomials that can be factored into a constant and linear factors are said to split.

P(X) = a⋅(X - x_0)⋅(X - x_1)⋅⋯⋅(X - x_{n-1})

the set \mathcal S = \set{x_0, x_1, …, x_{n-1}} \subset \F is called the set of roots, as P(x) = 0 iff x ∈ \mathcal S. Define the zero polynomial over a set \mathcal S as

\Z_{\mathcal S}(X) = \prod_{x∈\mathcal S} \p{X - x}

this is the monic polynomial of minimal degree such that

\forall_{x ∈ \mathcal S}\quad \Z_{\mathcal S}(x) = 0

the roots allow for some set theoretic operations (todo. generalize to multisets)

\begin{aligned} \Z_{\mathcal A \cup \mathcal B}(X) &= \frac{\Z_{\mathcal A}(X)⋅\Z_{\mathcal B}(X)}{\Z_{\mathcal A ∩ \mathcal B}(X)} & \Z_{\mathcal A \setminus \mathcal B}(X) &= \frac{\Z_{\mathcal A}(X)}{\Z_{\mathcal A ∩ \mathcal B}(X)} \\ \end{aligned}

in particular for x ∈ \mathcal S:

\Z_{\mathcal S \setminus \set{x}}(X) = \frac{\Z_{\mathcal S}(X)}{X - x}

Derivative

The derivative of \Z_{\mathcal S} can be found using the product rule or the logarithmic derivative

\begin{aligned} \Z_{\mathcal S}'(X) &= \sum_{x∈\mathcal S} \Z_{\mathcal S \setminus \set{x}}(X) = \sum_{x∈\mathcal S} \frac{\Z_{\mathcal S}(X)}{X - x} = \Z_{\mathcal S}(X) ⋅ \sum_{x∈\mathcal S} \frac{1}{X - x} \end{aligned}

The first form is suitable for evaluation on the domain \mathcal S, where it has an elegant result:

\begin{aligned} \forall_{x ∈ \mathcal S}\quad \Z_{\mathcal S}'(x) = \Z_{\mathcal S \setminus \set{x}}(x) \end{aligned}
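A short Python check of this identity on an arbitrary small integer domain (exact integer arithmetic stands in for a general field):

```python
# Check Z_S'(x) = Z_{S \ {x}}(x) for x in S, using the product-rule form of Z_S'.
from math import prod

S = [2, 3, 5, 7, 11]

def Z(dom, x):                      # Z_dom(x) = prod_{s in dom} (x - s)
    return prod(x - s for s in dom)

def Z_prime(dom, x):                # sum of Z_{dom \ {s}}(x) over s (product rule)
    return sum(Z([t for t in dom if t != s], x) for s in dom)

for x in S:
    assert Z_prime(S, x) == Z([t for t in S if t != x], x)
```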

Specific values

If \mathcal S are the n-th roots of unity \mathcal Ω_n = \set{1, ω_n, ω_n^2, …, ω_n^{n-1}} the zero polynomial has a sparse monomial form:

\begin{aligned} \Z_{\mathcal Ω_n}(X) &= X^n - 1 &\quad \Z_{\mathcal Ω_n}'(X) &= n⋅X^{n-1} \end{aligned}

If \mathcal S is the sequence [0,n) = \set{0,1,2,…,n-1} the zero polynomial is the falling factorial. For (-n, 0] it is the rising factorial:

\begin{aligned} \Z_{[0,n)}(X) &= X^{\underline{n}} &\quad \Z_{(-n,0]}(X) &= X^{\overline{n}} & \end{aligned}

These are related to the factorial and gamma function

\begin{aligned} \Z_{[0,n)}(X) &= \frac{\Gamma\p{X + 1}}{\Gamma\p{X - n + 1}} &\quad \Z_{(-n,0]}(X) &= \frac{\Gamma\p{X + n}}{\Gamma\p{X}} \end{aligned}
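A quick numerical check of these Gamma-function identities over the reals (the Gamma form only makes sense over ℝ or ℂ, not in a finite field); the values of x and n are arbitrary:

```python
# Falling and rising factorials versus ratios of Gamma functions.
from math import gamma, isclose

def falling(x, n):                  # Z_{[0,n)}(x) = x (x-1) ... (x-n+1)
    out = 1.0
    for k in range(n):
        out *= x - k
    return out

def rising(x, n):                   # Z_{(-n,0]}(x) = x (x+1) ... (x+n-1)
    out = 1.0
    for k in range(n):
        out *= x + k
    return out

x, n = 6.5, 4
assert isclose(falling(x, n), gamma(x + 1) / gamma(x - n + 1))
assert isclose(rising(x, n), gamma(x + n) / gamma(x))
```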

In general they have lots of uses in combinatorics and umbral calculus (todo explore)

Evaluation basis

Given an evaluation domain \vec x ∈ \F^n containing n distinct elements, we can also represent P ∈ \F[X]_{<n} uniquely by its evaluations \vec y ∈ \F^n on this domain, y_i = P(x_i).

We can again construct a standard basis \vec L ∈ \F[X]^n, in this case called a Lagrange basis

L_i(X) = \frac{\Z_{\vec x \setminus \set{x_i}}(X)}{\Z_{\vec x \setminus \set{x_i}}(x_i)} = \frac{\Z_{\vec x \setminus \set{x_i}}(X)}{\Z_{\vec x}'(x_i)}

these satisfy L_i(x_j) = δ_{ij}.

If we define normalization constants l_i = \p{\Z_{\vec x \setminus \set{x_i}}(x_i)}^{-1} then we get the barycentric form:

L_i(X) = \Z_{\vec x}(X)⋅\frac{l_i}{X - x_i}

this form is useful for evaluating the basis at x∉\vec x, as \Z_{\vec x}(x) can be computed once and shared across all L_i. On the domain itself the fraction is ill-defined (0/0), so there the quotient should be read as the polynomial \Z_{\vec x \setminus \set{x_i}}(X):

L_i(X) = l_i⋅\frac{\Z_{\vec x}(X)}{X - x_i}
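A small Python sketch of barycentric evaluation using exact rational arithmetic; the domain, the test polynomial, and the evaluation point are arbitrary choices.

```python
# Evaluate P at a point outside the domain via the barycentric weights
# l_i = 1 / Z_{x \ {x_i}}(x_i); Z_x(x) is computed once and shared by all terms.
from fractions import Fraction

def prod(vals):
    out = Fraction(1)
    for v in vals:
        out *= v
    return out

xs = [Fraction(v) for v in (0, 1, 2, 3)]     # evaluation domain
ys = [x**3 - 2*x + 1 for x in xs]            # P(x_i) for P(X) = X^3 - 2 X + 1
ls = [1 / prod(xi - xj for xj in xs if xj != xi) for xi in xs]

def eval_barycentric(x):
    Zx = prod(x - xi for xi in xs)           # Z_x(x), computed once
    return Zx * sum(li * yi / (x - xi) for li, yi, xi in zip(ls, ys, xs))

x = Fraction(5)
assert eval_barycentric(x) == x**3 - 2*x + 1
```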

Derivative

Say we have P(X) ∈ \F[X] represented in the evaluation basis on domain \vec x as \vec p, and we wish to know its derivative P'(X) in the same basis.

\begin{aligned} P(X) &= \sum_{i∈[0,n)} p_i ⋅ L_i(X) &\quad P'(X) &= \sum_{i∈[0,n)} p_i ⋅ L_i'(X) \end{aligned}

so to compute P'(X) in the evaluation basis we need to know L_i'(x_j). Start with computing the derivative

L_i'(X) = \sum_{x∈\vec x \setminus \set{x_i}} \frac{L_i(X)}{X - x}

L_i'(X) = \frac{\Z_{\vec x \setminus \set{x_i}}'(X)}{\Z_{\vec x \setminus \set{x_i}}(x_i)}

\begin{aligned} L_i'(X) &= l_i⋅ \p{\frac{\Z_{\vec x}'(X)}{X - x_i} - \frac{\Z_{\vec x}(X)}{\p{X - x_i}^2}} \\ &= l_i⋅ \p{ \frac{\sum_{x∈\vec x} \frac{\Z_{\vec x}(X)}{X - x}}{X - x_i} - \frac{\Z_{\vec x}(X)}{\p{X - x_i}^2}} \\ &= l_i⋅ \p{ \sum_{x∈\vec x} \frac{\Z_{\vec x}(X)}{\p{X - x}⋅\p{X - x_i}} - \frac{\Z_{\vec x}(X)}{\p{X - x_i}^2} } \\ &= l_i⋅\sum_{x∈\vec x \setminus \set{x_i}} \frac{\Z_{\vec x}(X)}{\p{X - x}⋅\p{X - x_i}} \\ &= l_i ⋅ \frac{\Z_{\vec x}(X)}{X - x_i}⋅\sum_{x∈\vec x \setminus \set{x_i}} \frac{1}{X - x} \\ &= L_i(X) ⋅ \sum_{x∈\vec x \setminus \set{x_i}} \frac{1}{X - x} \\ &= \sum_{x∈\vec x \setminus \set{x_i}} \frac{L_i(X)}{X - x} \\ \end{aligned}

The on-diagonal elements L_i'(x_i) are

L_i'(x_i) = \frac{\Z_{\vec x \setminus \set{x_i}}'(x_i)}{\Z_{\vec x \setminus \set{x_i}}(x_i)} = \sum_{x ∈ \vec x \setminus \set{x_i}} \frac{1}{x_i - x}

The off-diagonal elements L_i'(x_j) for j ≠ i are

L_i'(x_j) = \frac{\Z_{\vec x \setminus \set{x_i}}'(x_j)}{\Z_{\vec x \setminus \set{x_i}}(x_i)} = \frac{\Z_{\vec x \setminus \set{x_i,x_j}}(x_j)}{\Z_{\vec x \setminus \set{x_i}}(x_i)} = \frac{1}{x_j - x_i}⋅ \frac{\Z_{\vec x \setminus \set{x_j}}(x_j)}{\Z_{\vec x \setminus \set{x_i}}(x_i)} = \frac{1}{x_j - x_i}⋅ \frac{\Z_{\vec x}'(x_j)}{\Z_{\vec x}'(x_i)}

Todo. What is the rank of this matrix?

Roots of unity

On-diagonal

L_i'(\ω^i) = \frac{\Z_{Ω_n \setminus \set{\ω^i}}'(\ω^i)}{\Z_{Ω_n \setminus \set{\ω^i}}(\ω^i)} = \frac{\Z_{Ω_n \setminus \set{\ω^i}}'(\ω^i)}{\Z_{Ω_n}'(\ω^i)} = \frac{? }{n⋅\ω^{-i}}

\Z_{\mathcal A \setminus \mathcal B}'(X) = \D{X} \p{\frac{\Z_{\mathcal A}(X)}{\Z_{\mathcal A ∩ \mathcal B}(X)}} = \frac{\Z_{\mathcal A}'(X)⋅\Z_{\mathcal A ∩ \mathcal B}(X) - \Z_{\mathcal A}(X)⋅\Z_{\mathcal A ∩ \mathcal B}'(X)}{\p{\Z_{\mathcal A ∩ \mathcal B}(X)}^2}

\begin{aligned} \Z_{Ω_n \setminus \set{\ω^i}}'(X) &= \frac{\Z_{Ω_n}'(X)⋅\Z_{\set{\ω^i}}(X) - \Z_{Ω_n}(X)⋅\Z_{\set{\ω^i}}'(X)}{\p{\Z_{\set{\ω^i}}(X)}^2} \\ &= \frac{\p{n⋅X^{n-1}}⋅\p{X - \ω^i} - \p{X^n-1}}{\p{X - \ω^i}^2} \end{aligned}

At X = \ω^i this is 0/0, so instead write X^n - 1 = \p{X - \ω^i}⋅\Z_{Ω_n \setminus \set{\ω^i}}(X), differentiate twice and evaluate at X = \ω^i to get n⋅\p{n-1}⋅\ω^{-2⋅i} = 2⋅\Z_{Ω_n \setminus \set{\ω^i}}'(\ω^i). Assuming the characteristic is not 2, this gives

L_i'(\ω^i) = \frac{\frac{n⋅\p{n-1}}{2}⋅\ω^{-2⋅i}}{n⋅\ω^{-i}} = \frac{n-1}{2}⋅\ω^{-i}

Off-diagonal

L_i'(\ω^j) = \frac{1}{\ω^j - \ω^i}⋅ \frac{\Z_{Ω_n}'(\ω^j)}{\Z_{Ω_n}'(\ω^i)} = \frac{\ω^{i -j}}{\ω^j - \ω^i}

This is a Cauchy-like matrix.

Todo. Test these results numerically.
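A minimal numerical test (not from the original notes) of the on- and off-diagonal formulas, using complex n-th roots of unity as a stand-in for a field containing them; n = 8 and the test polynomial are arbitrary.

```python
# Build the differentiation matrix entries L_i'(w^j) from the closed forms and
# check that they map evaluations of P to evaluations of P'.
import cmath

n = 8
w = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

def L_prime(i, j):
    if i == j:
        return (n - 1) / (2 * w[i])             # on-diagonal: (n-1)/2 * w^-i
    return (w[i] / w[j]) / (w[j] - w[i])        # off-diagonal: w^(i-j) / (w^j - w^i)

P  = lambda x: 3 + 2 * x + 5 * x**3             # deg P < n
dP = lambda x: 2 + 15 * x**2

p = [P(x) for x in w]
for j in range(n):
    deriv = sum(L_prime(i, j) * p[i] for i in range(n))
    assert abs(deriv - dP(w[j])) < 1e-9
```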

https://www.ams.org/journals/mcom/1988-50-181/S0025-5718-1988-0917825-9/S0025-5718-1988-0917825-9.pdf

https://link.springer.com/article/10.1007/s11075-022-01391-y

https://arxiv.org/pdf/1506.02285.pdf

CRT Basis

Given pairwise coprime moduli m_i(X) with M(X) = \prod_i m_i(X), the CRT basis elements are

E_i(X) = \Mod{\p{\frac{M(X)}{m_i(X)}}^{-1}}_{m_i}⋅\frac{M(X)}{m_i(X)}

which satisfy \Mod{E_i(X)}_{m_i} = 1 and \Mod{E_i(X)}_{m_j} = 0 for j ≠ i. For linear moduli m_i(X) = X - x_i this reduces to the Lagrange basis.

NTT Solution

Pick \vec x to be the n-th roots of unity and use \NTT_n to convert \vec p to monomial form. Evaluate those coefficients on a coset \vec y with another \NTT_n, so together we have P on 2⋅n points. Square all values (2⋅n⋅\Mul_{\F}) and convert back to monomial form with an inverse \NTT_{2⋅n}. We now have the coefficients of P^2(X).
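A sketch of this pipeline over \F_{17} with n = 4 (hypothetical parameters; Python ≥ 3.8 for modular inverses via pow). A naive O(n²) transform stands in for the fast NTT, and for simplicity it evaluates directly on all 2⋅n-th roots of unity instead of reusing the original evaluations plus a coset; the result is the same.

```python
# Square a polynomial given by its evaluations on the n-th roots of unity mod q,
# by evaluating on the 2n-th roots, squaring pointwise, and interpolating back.
q = 17
n = 4
w2n = 2                              # primitive 8th root of unity mod 17; w_n = w2n^2

def ntt(coeffs, m, root):            # evaluate on {root^0, ..., root^(m-1)}
    return [sum(c * pow(root, i * j, q) for j, c in enumerate(coeffs)) % q
            for i in range(m)]

def intt(evals, m, root):            # inverse transform: back to coefficients
    inv_m = pow(m, -1, q)
    return [sum(e * pow(root, -i * j, q) for i, e in enumerate(evals)) * inv_m % q
            for j in range(m)]

P = [3, 1, 4, 1]                                  # P(X) = 3 + X + 4 X^2 + X^3
p_evals = ntt(P, n, pow(w2n, 2, q))               # given: evaluations on Omega_n

coeffs   = intt(p_evals, n, pow(w2n, 2, q))       # NTT_n back to monomial form
sq_evals = [e * e % q for e in ntt(coeffs + [0] * n, 2 * n, w2n)]   # 2n squarings
P2 = intt(sq_evals, 2 * n, w2n)                   # coefficients of P^2(X)

# cross-check against schoolbook multiplication
expected = [0] * (2 * n)
for i, a in enumerate(P):
    for j, b in enumerate(P):
        expected[i + j] = (expected[i + j] + a * b) % q
assert P2 == expected
```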

Basis transformations

Can every polynomial basis be written as a combination of point and derivative evaluations?

The conversion between the monomial basis and the evaluation basis on roots of unity is the \NTT.

Multiplication by a constant polynomial

Division by a constant polynomial

Modulo operations

Derivatives

The formal derivative is a linear operator.

\D{X} P(X)

\D{X}\p{a_0 + a_1 ⋅X + a_2 ⋅ X^2 + ⋯} = a_1 + 2 ⋅ a_2 ⋅ X + 3⋅a_3⋅X^2 + ⋯

Given a basis, linear operators can be expressed as matrices. The monomial basis almost diagonalizes the derivative operator, leaving a single superdiagonal:

\begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \\ \end{bmatrix}
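A minimal sketch of this matrix acting on a coefficient vector (n = 4 and the example polynomial are arbitrary choices):

```python
# Derivative as a matrix on coefficient vectors in the monomial basis, n = 4.
n = 4
D = [[(j if j == i + 1 else 0) for j in range(n)] for i in range(n)]
# D = [[0,1,0,0], [0,0,2,0], [0,0,0,3], [0,0,0,0]]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

c = [7, 5, 0, 2]                      # P(X)  = 7 + 5 X + 2 X^3
assert apply(D, c) == [5, 0, 6, 0]    # P'(X) = 5 + 6 X^2
```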

Q: Which basis diagonalizes it? Is there always such a basis?

https://en.wikipedia.org/wiki/Multiplier_(Fourier_analysis)#On_the_unit_circle

Multiplications

P(c ⋅ X)

a_0 + a_1⋅(c⋅ X) + a_2 ⋅ (c ⋅ X)^2 + ⋯ = a_0 + a_1 ⋅ c ⋅ X + a_2 ⋅ c^2 ⋅ X^2 + ⋯

\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & c & 0 & 0 \\ 0 & 0 & c^2 & 0 \\ 0 & 0 & 0 & c^3 \\ \end{bmatrix}

Translations

P(X+δ)

a_0 + a_1⋅(X + δ) + a_2 ⋅ (X + δ)^2 + ⋯ = \p{a_0 + a_1⋅δ + a_2 ⋅δ^2 + ⋯} + \p{a_1 + 2⋅a_2⋅δ + ⋯}⋅X + \p{a_2 + ⋯} ⋅ X^2 + ⋯

(X + δ)^2 = X^2 + 2⋅δ⋅X + δ^2
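The same expansion written as a matrix: entry (i, j) of the translation matrix is \binom{j}{i}⋅δ^{j-i}. A small Python sketch with arbitrary δ and coefficients:

```python
# Translation P(X) -> P(X + d) as a matrix on coefficient vectors, degree < 4.
from math import comb

n, d = 4, 3
T = [[comb(j, i) * d**(j - i) if j >= i else 0 for j in range(n)] for i in range(n)]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

c = [1, 0, 2, 1]                     # P(X) = 1 + 2 X^2 + X^3
shifted = apply(T, c)                # coefficients of P(X + 3)

P  = lambda x: sum(ci * x**i for i, ci in enumerate(c))
Ps = lambda x: sum(ci * x**i for i, ci in enumerate(shifted))
assert Ps(10) == P(10 + d)
```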

Chinese Remainder Theorem

A⋅B \bmod X-1

https://en.wikipedia.org/wiki/Companion_matrix#Diagonalizability

HSS Matrices: https://mipals.github.io/pubs/matrix/hss/

Coding Theory

Reed-Solomon codes. Linear block codes that are maximum distance separable (MDS).

Vandermonde matrix V:

\begin{bmatrix} I & V \end{bmatrix}

Cauchy matrix C:

\begin{bmatrix} I & C \end{bmatrix}

https://web.eecs.utk.edu/~jplank/plank/papers/CS-05-569.pdf

List decoding

Linear Algebra

A vector space \F^n and a matrix space \F^{n×m}.

Linear maps \F^n ⇄ \F^{n×1}, \F^n ⇄ \F^{1×n}, \F^n → \F^{n×n}. Given \vec a ∈ \F^n.

column-vector, row-vector, diagonal matrix.

\begin{aligned} \begin{bmatrix} a_0 \\ a_1 \\ ⋮ \\ a_{n-1} \end{bmatrix}, \begin{bmatrix} a_0 & a_1 & ⋯ & a_{n-1} \end{bmatrix}, \begin{bmatrix} a_0 & 0 & ⋯ & 0 \\ 0 & a_1 & ⋯ & 0 \\ ⋮ & ⋮ & ⋱ & ⋮ \\ 0 & 0 & ⋯ & a_{n-1} \end{bmatrix} \end{aligned}

row-major order, column-major order, z-order

https://en.wikipedia.org/wiki/Row-_and_column-major_order

\F^{n⋅m} ⇄ \F^{n×m}

\begin{aligned} \begin{bmatrix} a_0 & a_1 & ⋯ & a_{n-1} \\ a_n & a_{n+1} & ⋯ & a_{2⋅n - 1} \\ ⋮ & ⋮ & ⋱ & ⋮ \\ a_{(m-1)⋅n} & a_{(m-1)⋅n + 1} & ⋯ & a_{n⋅m-1} \end{bmatrix}, \begin{bmatrix} a_0 & a_n & ⋯ & a_{(m-1)⋅n} \\ a_1 & a_{n+1} & ⋯ & a_{(m-1)⋅n + 1} \\ ⋮ & ⋮ & ⋱ & ⋮ \\ a_{n-1} & a_{2⋅n - 1} & ⋯ & a_{n⋅m-1} \end{bmatrix} \end{aligned}

https://en.wikipedia.org/wiki/Z-order_curve

CRT order. As in Good-Thomas FFT.

n → (\Mod{n}_{m_1},\Mod{n}_{m_2})

Permutation Matrix

Rader's permutation https://en.wikipedia.org/wiki/Rader%27s_FFT_algorithm

The multiplicative group is cyclic

\F_p^× ≅ C_{p-1}
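A small sketch (p = 17 is an arbitrary choice): find a generator g of \F_p^× by brute force; its powers enumerate every nonzero element exactly once, which is the reindexing used by Rader's algorithm.

```python
# The multiplicative group of F_p is cyclic: some g generates all nonzero elements.
p = 17

def is_generator(g):
    return len({pow(g, k, p) for k in range(p - 1)}) == p - 1

g = next(g for g in range(2, p) if is_generator(g))      # g = 3 for p = 17
rader_order = [pow(g, k, p) for k in range(p - 1)]        # Rader's permutation
assert sorted(rader_order) == list(range(1, p))
```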
