Chapter 6: Proof of Main Theorem

We now assemble the pieces to prove Theorem~thm:main.

6.1 Chebyshev roots and their power sums

To prove that \(S_N(x; \theta) = T_N(x) + c(\theta)\), we must show that the roots \(r_k(\theta)\) and the Chebyshev roots \(\xi_k\) have matching power sums. We already know the rotated roots have \(\theta\)-independent power sums. Now we compute the power sums of the Chebyshev roots explicitly.

Lemma

The roots of the Chebyshev polynomial \(T_N(x)\) are

\begin{equation}
\xi_k = \cos\!\left(\frac{(2k+1)\pi}{2N}\right), \qquad k = 0, 1, \ldots, N-1.
\end{equation}

These are \(N\) distinct real numbers in the interval \((-1, 1)\). chebyshevRoot_in_Icc, chebyshevRoots_distinct

Proof.

By the defining relation \(T_N(\cos\theta) = \cos(N\theta)\), we have \(T_N(\cos\theta) = 0\) if and only if \(N\theta\) is an odd multiple of \(\pi/2\), i.e., \(\theta = (2k+1)\pi/(2N)\) for some integer \(k\). For \(k \in \{0, \ldots, N-1\}\) the angles \(\theta_k = (2k+1)\pi/(2N)\) lie in \((0, \pi)\), where the cosine is strictly decreasing, so the values \(\xi_k = \cos\theta_k\) are distinct and lie in \((-1, 1)\). chebyshev_T_eval_chebyshevRoot, chebyshev_T_eval_eq_zero_iff

Computational examples: Chebyshev roots

Example

The roots of \(T_3(x) = 4x^3 - 3x\) are:

\begin{align*}
\xi_0 &= \cos\frac{\pi}{6} = \frac{\sqrt{3}}{2}, \\
\xi_1 &= \cos\frac{\pi}{2} = 0, \\
\xi_2 &= \cos\frac{5\pi}{6} = -\frac{\sqrt{3}}{2}.
\end{align*}

Power sums:

\begin{align*}
p_1 &= \xi_0 + \xi_1 + \xi_2 = 0, \\
p_2 &= \xi_0^2 + \xi_1^2 + \xi_2^2 = \tfrac{3}{4} + 0 + \tfrac{3}{4} = \tfrac{3}{2}.
\end{align*}

Note that \(p_2 = \frac{3}{2} = \frac{N}{2^j}\binom{j}{j/2} = \frac{3}{4} \cdot 2 = \frac{3}{2}\) when \(j = 2\).

Example

The roots of \(T_4(x) = 8x^4 - 8x^2 + 1\) are:

\begin{align*}
\xi_0 &= \cos\frac{\pi}{8} = \frac{\sqrt{2+\sqrt{2}}}{2}, \\
\xi_1 &= \cos\frac{3\pi}{8} = \frac{\sqrt{2-\sqrt{2}}}{2}, \\
\xi_2 &= \cos\frac{5\pi}{8} = -\frac{\sqrt{2-\sqrt{2}}}{2}, \\
\xi_3 &= \cos\frac{7\pi}{8} = -\frac{\sqrt{2+\sqrt{2}}}{2}.
\end{align*}

Power sums:

\begin{align*}
p_1 &= 0, \\
p_2 &= 2 \cdot \frac{2+\sqrt{2}}{4} + 2 \cdot \frac{2-\sqrt{2}}{4} = 2, \\
p_3 &= 0.
\end{align*}

Example

The roots of \(T_5(x) = 16x^5 - 20x^3 + 5x\) are:

\begin{align*}
\xi_0 &= \cos\frac{\pi}{10} = \frac{\sqrt{10+2\sqrt{5}}}{4}, \\
\xi_1 &= \cos\frac{3\pi}{10} = \frac{\sqrt{10-2\sqrt{5}}}{4}, \\
\xi_2 &= \cos\frac{\pi}{2} = 0, \\
\xi_3 &= \cos\frac{7\pi}{10} = -\xi_1, \\
\xi_4 &= \cos\frac{9\pi}{10} = -\xi_0.
\end{align*}

Power sums: \(p_1 = 0\), \(p_2 = 5/2\), \(p_3 = 0\), \(p_4 = 15/8\).
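
These agree with the closed form \(\frac{N}{2^j}\binom{j}{j/2}\) for even \(j\): for \(j = 2\), \(\frac{5}{4}\binom{2}{1} = \frac{5}{2}\); for \(j = 4\), \(\frac{5}{16}\binom{4}{2} = \frac{15}{8}\); and the odd power sums vanish.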

6.2 Power sum equality

Theorem 6.2.1

For $1 \leq j < N$, $\sum_{k=0}^{N-1} \cos^j\!\left(\frac{(2k+1)\pi}{2N}\right) = \begin{cases} N \cdot 2^{-j} \binom{j}{j/2} & j \text{ even} \\ 0 & j \text{ odd} \end{cases}$. The proof parallels the rotated roots case but uses Chebyshev-specific orthogonality (vanishing of cosine sums at Chebyshev angles) instead of primitive root orthogonality.
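
Concretely, the computation rests on the standard expansion
\begin{equation*}
\cos^j \varphi \;=\; 2^{-j} \sum_{r=0}^{j} \binom{j}{r} \cos\bigl((2r - j)\varphi\bigr),
\end{equation*}
applied at the Chebyshev angles \(\varphi_k = (2k+1)\pi/(2N)\); up to the placement of the \(2^{-j}\) factor, this is the content of $\texttt{sum\_cos\_pow\_chebyshev\_binomial}$. After summing over \(k\), every frequency \(2r - j \neq 0\) contributes a vanishing cosine sum, so only the middle term \(r = j/2\), which exists only for even \(j\), survives; its \(N\) summands each equal \(1\), giving \(N \cdot 2^{-j} \binom{j}{j/2}\).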

theorem chebyshev_roots_powersum_value (N j : ℕ) (hN : 0 < N) (hj : 0 < j) (hj' : j < N) :
    ∑ k ∈ Finset.range N, (Real.cos ((2 * k + 1 : ℝ) * π / (2 * N))) ^ j =
    if Even j then N * (2 : ℝ) ^ (-(j : ℤ)) * (j.choose (j / 2)) else 0 := by
Proof

Apply the Chebyshev binomial expansion, then invoke $\texttt{binomial\_terms\_vanish\_chebyshev}$ for all terms with $(2r-j) \neq 0$. Only the middle term $r = j/2$ survives when $j$ is even.


  -- Apply binomial expansion from ChebyshevOrthogonality.lean
  have h_bound : j < 2 * N := by omega
  rw [ChebyshevCircles.sum_cos_pow_chebyshev_binomial N j hN hj h_bound]

  -- All non-constant terms vanish by binomial_terms_vanish_chebyshev
  by_cases h_even : Even j
  · -- j is even: only the middle term (r = j/2) contributes
    rw [if_pos h_even]
    obtain ⟨m, hm⟩ := h_even

    -- Show that only the term where r = j/2 is non-zero
    have sum_eq : ∑ r ∈ Finset.range (j + 1), (j.choose r : ℝ) *
        ∑ k ∈ Finset.range N, Real.cos ((2 * r - j : ℤ) * (2 * k + 1 : ℝ) * π / (2 * N)) =
      (j.choose (j / 2) : ℝ) * N := by
      -- Apply binomial_terms_vanish_chebyshev to show all terms except r = j/2 vanish
      have hj_2_dvd : 2 ∣ j := by
        use m
        rw [hm]
        ring
      have hj_div : j / 2 = m := by omega

      have : ∑ r ∈ Finset.range (j + 1), (j.choose r : ℝ) *
          ∑ k ∈ Finset.range N,
            Real.cos ((2 * r - j : ℤ) * (2 * k + 1 : ℝ) * π / (2 * N)) =
        ∑ r ∈ Finset.range (j + 1),
          if r = j / 2 then (j.choose r : ℝ) * N else 0 := by
        apply Finset.sum_congr rfl
        intro r hr
        by_cases h_mid : r = j / 2
        · -- r = j/2: cos(0 * ...) = 1, sum = N
          rw [if_pos h_mid, h_mid]
          have h_zero : (2 * (j / 2) - j : ℤ) = 0 := by
            -- We have j = m + m, and j / 2 = m
            have h1 : (j / 2 : ℤ) = (m : ℤ) := by exact_mod_cast hj_div
            have h2 : (j : ℤ) = (m : ℤ) + (m : ℤ) := by omega
            rw [h1, h2]; ring
          simp [h_zero]
        · -- r ≠ j/2: apply binomial_terms_vanish_chebyshev
          rw [if_neg h_mid]
          have h_neq : (2 * r - j : ℤ) ≠ 0 := by
            intro h_eq
            have h_eq_nat : 2 * r = j := by omega
            have : r = j / 2 := by
              have : j = 2 * r := by omega
              rw [this, Nat.mul_div_cancel_left r (by omega : 0 < 2)]
            contradiction
          have h_vanish := ChebyshevCircles.binomial_terms_vanish_chebyshev N j r hN hj' hr h_neq
          rw [h_vanish]
          ring
      rw [this]
      simp [Finset.sum_ite_eq', Finset.mem_range]
      omega
    rw [sum_eq]
    ring
  · -- j is odd: all terms vanish
    rw [if_neg h_even]
    have sum_zero : ∑ r ∈ Finset.range (j + 1), (j.choose r : ℝ) *
        ∑ k ∈ Finset.range N,
          Real.cos ((2 * r - j : ℤ) * (2 * k + 1 : ℝ) * π / (2 * N)) = 0 := by
      -- All terms vanish: (2r - j) is odd when j is odd, apply binomial_terms_vanish_chebyshev
      apply Finset.sum_eq_zero
      intro r hr
      -- Show (2*r - j) ≠ 0
      have h_neq : (2 * r - j : ℤ) ≠ 0 := by
        intro h_eq
        have h_eq_nat : 2 * r = j := by omega
        -- This contradicts j being odd (2*r is even)
        have : Even j := ⟨r, by omega⟩
        contradiction
      -- Apply binomial_terms_vanish_chebyshev
      have h_vanish := ChebyshevCircles.binomial_terms_vanish_chebyshev N j r hN hj' hr h_neq
      rw [h_vanish]
      ring
    rw [sum_zero]
    ring

This theorem is the key algebraic identity connecting the two root systems. Both power sums reduce to the same closed-form expression via independent binomial expansion arguments, one using primitive root orthogonality and the other using Chebyshev angle orthogonality. The matching power sums, combined with Newton's identities, force the polynomial coefficients to agree.

Theorem 6.2.2

For $1 \leq j < N$, $\sum_{k=0}^{N-1} \cos^j\!\left(\theta + \frac{2\pi k}{N}\right) = \sum_{k=0}^{N-1} \cos^j\!\left(\frac{(2k+1)\pi}{2N}\right)$.

theorem general_powersum_equality (N j : ℕ) (hN : 0 < N) (hj : 0 < j) (hj' : j < N) :
    ∑ k ∈ Finset.range N, (Real.cos (2 * π * k / N)) ^ j =
    ∑ k ∈ Finset.range N, (Real.cos ((2 * k + 1 : ℝ) * π / (2 * N))) ^ j := by
Proof

Both sums are evaluated in closed form by the rotated-roots and Chebyshev-roots power-sum lemmas; the two closed forms are identical, which establishes the equality. Combined with the \(\theta\)-independence of the rotated-root power sums, this gives the statement for arbitrary \(\theta\).


  -- Both sums equal the same closed-form expression
  rw [rotated_roots_powersum_value N j hN hj hj']
  rw [chebyshev_roots_powersum_value N j hN hj hj']
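
As an informal cross-check outside the formal development, the two sums can also be compared numerically with Lean's \texttt{Float} arithmetic; the helper definitions below (\texttt{floatPi}, \texttt{fpow}, \texttt{rotatedSum}, \texttt{chebSum}) are ad hoc and not part of the formalization.

-- Numerical sanity check using Float arithmetic (not a proof).
def floatPi : Float := 3.141592653589793

-- x^n for a Float base and a Nat exponent, by repeated multiplication.
def fpow (x : Float) : Nat → Float
  | 0 => 1.0
  | n + 1 => x * fpow x n

-- Power sum of the rotated roots at θ = 0: Σ_k cos(2πk/N)^j.
def rotatedSum (N j : Nat) : Float :=
  (List.range N).foldl
    (fun acc k => acc + fpow (Float.cos (2 * floatPi * k.toFloat / N.toFloat)) j) 0.0

-- Power sum of the Chebyshev roots: Σ_k cos((2k+1)π/(2N))^j.
def chebSum (N j : Nat) : Float :=
  (List.range N).foldl
    (fun acc k => acc + fpow (Float.cos ((2 * k.toFloat + 1) * floatPi / (2 * N.toFloat))) j) 0.0

#eval rotatedSum 5 2  -- ≈ 2.5
#eval chebSum 5 2     -- ≈ 2.5
#eval rotatedSum 5 4  -- ≈ 1.875
#eval chebSum 5 4     -- ≈ 1.875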

By Newton's identities, matching power sums for $1 \leq j < N$ imply matching elementary symmetric polynomials $e_1, \ldots, e_{N-1}$, and hence matching coefficients of the associated monic polynomials in degrees $1$ through $N$; only the constant term is left undetermined. This is how the main theorem follows.
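
As a concrete instance, take \(N = 3\): the first two power sums are \(p_1 = 0\) and \(p_2 = \tfrac{3}{2}\), so Newton's recursion gives \(e_1 = p_1 = 0\) and \(2 e_2 = e_1 p_1 - p_2 = -\tfrac{3}{2}\), i.e., \(e_2 = -\tfrac{3}{4}\). Accordingly, both monic cubics have coefficient \(-e_1 = 0\) at \(x^2\) and coefficient \(e_2 = -\tfrac{3}{4}\) at \(x\), matching \(T_3(x)/4 = x^3 - \tfrac{3}{4}x\); only the constant term \(-e_3\) carries the \(\theta\)-dependence.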

6.3 Completion of the proof

We now have all the ingredients to prove the main theorem.

Proof.

Let \(r_k(\theta) = \cos(\theta + 2\pi k/N)\) for \(k = 0, \ldots, N-1\) be the roots of \(P_N(x; \theta)\), and let \(\xi_k = \cos((2k + 1)\pi/(2N))\) be the roots of \(T_N(x)\). realProjectionsList, chebyshevRootsList

By Theorem~thm:power_sum_equality, the power sums of \(\{r_k(\theta)\}\) and \(\{\xi_k\}\) are equal for all \(j\) with \(1 \leq j < N\):

\begin{equation}
\sum_{k=0}^{N-1} r_k(\theta)^j \;=\; \sum_{k=0}^{N-1} \xi_k^j, \qquad 1 \leq j < N.
\end{equation}

By Newton's identities (Theorem~thm:newton_identities), each elementary symmetric polynomial \(e_k\) is determined recursively from the power sums \(p_1, \ldots, p_k\); explicitly, \(k\,e_k = \sum_{i=1}^{k} (-1)^{i-1} e_{k-i}\, p_i\) with \(e_0 = 1\). Since the power sums match for \(1 \leq j \leq N-1\), and the recursion for \(e_k\) with \(k \leq N-1\) involves only \(p_1, \ldots, p_k\), the elementary symmetric polynomials match for \(k = 1, \ldots, N-1\):

\begin{equation}
e_k\bigl(r_0(\theta), \ldots, r_{N-1}(\theta)\bigr) \;=\; e_k\bigl(\xi_0, \ldots, \xi_{N-1}\bigr), \qquad k = 1, \ldots, N-1.
\end{equation}

esymm_rotated_roots_invariant

The monic polynomial \(P_N(x; \theta)\) has roots \(r_k(\theta)\) and can be written as

\begin{equation}
P_N(x; \theta) \;=\; \prod_{k=0}^{N-1}\bigl(x - r_k(\theta)\bigr) \;=\; x^N - e_1 x^{N-1} + e_2 x^{N-2} - \cdots + (-1)^N e_N,
\end{equation}

where \(e_k = e_k(r_0(\theta), \ldots, r_{N-1}(\theta))\) are the elementary symmetric polynomials. polynomialFromRealRoots, unscaledPolynomial_monic

Similarly, the Chebyshev polynomial \(T_N(x)\) has leading coefficient \(2^{N-1}\) and roots \(\xi_k\), so

\begin{equation}
T_N(x) \;=\; 2^{N-1} \prod_{k=0}^{N-1}\bigl(x - \xi_k\bigr).
\end{equation}

chebyshev_T_leadingCoeff

By Vieta's formulas, the coefficient of \(x^{N-k}\) in \(P_N(x; \theta)\) is \((-1)^k e_k(r_0, \ldots, r_{N-1})\), and the coefficient of \(x^{N-k}\) in \(T_N(x)/2^{N-1}\) is \((-1)^k e_k(\xi_0, \ldots, \xi_{N-1})\). Multiset.prod_X_sub_C_coeff

Since the elementary symmetric polynomials match for \(k = 1, \ldots, N-1\), and both \(P_N(x; \theta)\) and \(T_N(x)/2^{N-1}\) are monic of degree \(N\), the two polynomials have the same coefficients in every degree \(k = 1, \ldots, N\). They may differ only in the constant term (degree 0), which is \((-1)^N e_N\).

Multiplying both by \(2^{N-1}\), the scaled polynomial

\begin{equation}
S_N(x; \theta) \;=\; 2^{N-1} P_N(x; \theta) \;=\; 2^{N-1} \prod_{k=0}^{N-1}\bigl(x - r_k(\theta)\bigr)
\end{equation}

and \(T_N(x)\) have the same coefficients for degrees \(k = 1, \ldots, N\). They may differ in the constant term. scaledPolynomial_leadingCoeff, scaledPolynomial_degree

Hence

\begin{equation}
S_N(x; \theta) \;=\; T_N(x) + c(\theta),
\end{equation}

where \(c(\theta)\) is given explicitly by eq:explicit_constant as the difference in constant terms. mainTheorem

Moreover, since the coefficients of \(x^k\) for \(k \geq 1\) match exactly, we have coefficient-wise equality:

\begin{equation}
\bigl[x^k\bigr]\, S_N(x; \theta) \;=\; \bigl[x^k\bigr]\, T_N(x) \qquad \text{for all } k \geq 1.
\end{equation}

scaledPolynomial_matches_chebyshev_at_zero

This completes the proof of Theorem~thm:main.

6.4 Explicit verification for small N

To build intuition, we verify the theorem explicitly for small values of \(N\).

Example

The rotated roots at \(\theta = 0\) are \(r_k = \cos(2\pi k/3)\) for \(k = 0, 1, 2\):

\begin{equation}
r_0 = \cos 0 = 1, \qquad r_1 = \cos\frac{2\pi}{3} = -\frac{1}{2}, \qquad r_2 = \cos\frac{4\pi}{3} = -\frac{1}{2}.
\end{equation}

The unscaled polynomial is

\begin{equation}
(x - 1)\left(x + \tfrac{1}{2}\right)^{2} \;=\; x^3 - \tfrac{3}{4}x - \tfrac{1}{4}.
\end{equation}

Scaling by \(2^{N-1} = 4\) gives

\begin{equation}
4\left(x^3 - \tfrac{3}{4}x - \tfrac{1}{4}\right) \;=\; 4x^3 - 3x - 1.
\end{equation}

This differs from \(T_3(x) = 4x^3 - 3x\) by the constant \(c(0) = -1\), as predicted.

Example

The rotated roots at \(\theta = 0\) are \(r_k = \cos(\pi k/2)\) for \(k = 0, 1, 2, 3\):

\begin{equation}
r_0 = \cos 0 = 1, \qquad r_1 = \cos\frac{\pi}{2} = 0, \qquad r_2 = \cos\pi = -1, \qquad r_3 = \cos\frac{3\pi}{2} = 0.
\end{equation}

The unscaled polynomial is

\begin{equation}
(x - 1)\,x^2\,(x + 1) \;=\; x^4 - x^2.
\end{equation}

Scaling by \(2^3 = 8\) gives

\begin{equation}
8\left(x^4 - x^2\right) \;=\; 8x^4 - 8x^2.
\end{equation}

This differs from \(T_4(x) = 8x^4 - 8x^2 + 1\) by the constant \(c(0) = -1\).
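
The constant really does depend on \(\theta\). For \(N = 3\) and \(\theta = \pi/6\), the rotated roots are \(\cos(\pi/6) = \tfrac{\sqrt{3}}{2}\), \(\cos(5\pi/6) = -\tfrac{\sqrt{3}}{2}\), and \(\cos(3\pi/2) = 0\), so the scaled polynomial is
\begin{equation*}
4\,x\left(x - \tfrac{\sqrt{3}}{2}\right)\left(x + \tfrac{\sqrt{3}}{2}\right) \;=\; 4x^3 - 3x \;=\; T_3(x),
\end{equation*}
giving \(c(\pi/6) = 0\), in contrast to \(c(0) = -1\) above.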