Locally Confused

An unusual application of the Skolem-Noether theorem

last edited around 2020-08-02

\(\DeclareMathOperator{\End}{End} \DeclareMathOperator{\im}{Im} \DeclareMathOperator{\diag}{diag}\)

CSAs and Skolem-Noether

Definition 1
  • A \(k\)-algebra is a ring \(A\) with a designated inclusion \(\iota\colon k\hookrightarrow Z(A)\), and a homomorphism of \(k\)-algebras is a ring homomorphism commuting with these inclusions. For brevity, we will omit the inclusion and identify \(k\) with \(\iota(k)\) where unambiguous.
  • \(A\) is central simple (“CSA”) if it has no two-sided ideals other than \(0\) and \(A\), and \(Z(A) = k\).

It is quite easy to prove:

Lemma 2
If \(V\) is an \(n\)-dimensional \(k\)-vector space, then \(\End_k(V)\) is a CSA.
Proof

Choosing a basis gives us an isomorphism to \(M_n(k)\). Fix any nonzero matrix \(M\). Since two-sided ideals are closed under addition as well as multiplication by arbitrary matrices from both sides, we can bring \(M\) into a form whose first column contains a single \(1\) and is otherwise zero. Multiplying from the right by \(\diag(1, 0, \ldots, 0)\) then yields a matrix unit \(E_{i,1}\), i.e. a matrix whose only nonzero entry is a single \(1\) at position \((i, 1)\). Appropriate conjugation moves this entry around arbitrarily, so the generated two-sided ideal \((M)\) must contain \(\{E_{i,j}\mid 1\leq i,j\leq n\}\), which of course suffices to build all matrices using addition and scaling. Thus \((M) = M_n(k)\).
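This argument admits a one-identity variant that is easy to check numerically: for any nonzero entry \(M_{k,l}\) we have \(E_{i,k} M E_{l,j} = M_{k,l}\,E_{i,j}\), so the two-sided ideal generated by \(M\) already contains every matrix unit. A small numpy sketch (the helper `E` and the random matrix are our own illustration):

```python
import numpy as np

def E(i, j, n):
    """Matrix unit E_{ij}: a single 1 at position (i, j), zeroes elsewhere."""
    m = np.zeros((n, n))
    m[i, j] = 1.0
    return m

n = 4
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))  # a generic nonzero matrix

# Pick a nonzero entry (k, l) of M.  Then
#   E_{ik} @ M @ E_{lj} = M[k, l] * E_{ij},
# so every matrix unit lies in the two-sided ideal generated by M.
k, l = np.unravel_index(np.argmax(np.abs(M)), M.shape)
for i in range(n):
    for j in range(n):
        unit = E(i, k, n) @ M @ E(l, j, n) / M[k, l]
        assert np.allclose(unit, E(i, j, n))
```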

To identify the center, note that a matrix \(M\) commutes with \(\diag(0,\ldots,1,\ldots,0)=E_{ii}\) if and only if projecting onto the \(i\)-th column and projecting onto the \(i\)-th row produce the same result. This can only happen if the \(i\)-th row and column of \(M\) contain no nonzero entry outside \((i,i)\). Thus, a matrix commuting with all matrices – in particular, with all \(E_{ii}\) – must be diagonal, say \(D = \diag(d_{11},\ldots,d_{nn})\). On the other hand, let \(P_{ij}\) be the matrix corresponding to the elementary row operation of adding the \(j\)-th row to the \(i\)-th when multiplied from the left; multiplied from the right, it instead performs the column operation of adding the \(i\)-th column to the \(j\)-th. For our diagonal \(D\), these two can only agree if \(d_{ii}=d_{jj}\). Since this works for arbitrary \(1\leq i,j\leq n\), we are effectively restricted to multiples of the identity. Therefore, \(Z(M_n(k))\subseteq \{\lambda\mathbb 1\mid \lambda \in k\} = \iota(k)\), hence equality.
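The center can also be computed mechanically: using the vectorization identity \(\operatorname{vec}(BX - XB) = (I\otimes B - B^{\mathsf T}\otimes I)\operatorname{vec}(X)\), the center is the common null space of the commutator maps against all matrix units, and its dimension should come out as \(1\). A numpy sketch of this check (all names are ours):

```python
import numpy as np

n = 3
I = np.eye(n)

# vec(B @ X - X @ B) = (kron(I, B) - kron(B.T, I)) @ vec(X)
# (column-major vec).  Stack this commutator map for every matrix
# unit B = E_{ij}; the common null space is exactly Z(M_n(k)).
blocks = []
for i in range(n):
    for j in range(n):
        B = np.zeros((n, n))
        B[i, j] = 1.0
        blocks.append(np.kron(I, B) - np.kron(B.T, I))
A = np.vstack(blocks)

# nullity = dim Z(M_n(k)); it should be 1 (the scalar matrices).
nullity = A.shape[1] - np.linalg.matrix_rank(A)
print(nullity)
```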

Skolem-Noether now says:

Theorem 3
Let \(A\) be a simple \(k\)-algebra and let \(f, g\colon A\to B\) be two morphisms of \(k\)-algebras. If \(B\) is a finite-dimensional CSA, then \(f\) and \(g\) are conjugate via an element \(b\in B\), i.e. \(f(a) = bg(a)b^{-1}=: {}^b g(a)\) for all \(a\in A\).

As a corollary, all automorphisms of a finite-dimensional CSA must be inner. Powerful stuff!
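For \(B = M_n(k)\) the conjugating element can even be written down explicitly: given an automorphism \(\varphi\), the element \(b = \sum_i \varphi(E_{i1})E_{1i}\) satisfies \(\varphi(X)\,b = b\,X\) for all \(X\), so \(\varphi = {}^b(-)\) as soon as \(b\) is invertible. A numerical sketch using an automorphism that is secretly inner (the names `phi` and `P` are ours; invertibility of \(b\) holds for generic \(P\)):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
P = rng.standard_normal((n, n))            # generic, hence invertible
phi = lambda X: P @ X @ np.linalg.inv(P)   # a (secretly inner) automorphism

def E(i, j):
    """Matrix unit E_{ij}."""
    m = np.zeros((n, n))
    m[i, j] = 1.0
    return m

# Skolem-Noether-style recovery: b = sum_i phi(E_{i1}) E_{1i}
# satisfies phi(X) b = b X for every X, since checking it on the
# matrix units E_{jk} reduces to phi(E_{jk} E_{k1}) = phi(E_{j1}).
b = sum(phi(E(i, 0)) @ E(0, i) for i in range(n))

X = rng.standard_normal((n, n))
```

One can now verify \(\varphi(X) = bXb^{-1}\); for an inner \(\varphi\) as above, \(b\) comes out proportional to \(P\).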

Application: Linear operators

Now for the core idea: if \(S, T\in \End_k(V)\) are two operators with the same minimal polynomial \(\mu\), can't we somehow use that structural similarity to find a conjugating element? Because then we would already be finished: for any \(\mu(x)\) that splits into \(\prod_i{(x-\lambda_i)}^{\alpha_i}\) (in particular if \(k\) is algebraically closed) and any given basis \(\mathscr B\), the operator corresponding to the Jordan matrix \(J\) with one block of size \(\alpha_i\) for each eigenvalue \(\lambda_i\) has \(\mu\) as its minimal polynomial. In that case, any \(T\) with minimal polynomial \(\mu\) would be conjugate to \(J\), say \(T = {}^B J\), implying that \(\{Bb_i\mid b_i\in \mathscr B\}\) is a basis bringing \(T\) into Jordan normal form.
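To make the bookkeeping concrete, here is a small numpy sketch for \(\mu(x) = x(x-1)^2\): we build the Jordan matrix \(J\), conjugate it by a generic \(B\), and check both that the result satisfies \(\mu\) and that the columns of \(B\) form a Jordan basis (all names are ours):

```python
import numpy as np

# mu(x) = x * (x - 1)^2; J has a 1x1 block for eigenvalue 0
# and a 2x2 block for eigenvalue 1.
J = np.array([[0., 0., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))   # generic, hence invertible
T = B @ J @ np.linalg.inv(B)      # T = {}^B J

# T satisfies mu: T (T - I)^2 = 0 ...
I = np.eye(3)
assert np.allclose(T @ (T - I) @ (T - I), np.zeros((3, 3)))

# ... and the columns of B (the images B b_i of the standard basis)
# are a basis putting T into Jordan normal form: T B = B J.
assert np.allclose(T @ B, B @ J)
```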

You might have guessed how to formally prove that \(S\) and \(T\) are indeed conjugate:

Theorem 4
Let \(S, T\in \End_k(V)\) where \(V\) is finite-dimensional. If \(S\) and \(T\) have the same minimal polynomial \(\mu\in k[x]\), they are conjugate.
Proof
Let \(f, g\colon k[x]/(\mu)\to \End_k(V)\) be given by evaluation: \(f(p + (\mu)) := p(S)\) and \(g(p+(\mu)):=p(T)\); these are well defined precisely because \(\mu(S) = \mu(T) = 0\). Observing that \(f,g\) are \(k\)-algebra homomorphisms and that \(\End_k(V)\) is a finite-dimensional CSA, apply Skolem-Noether and evaluate the resulting identity \(f = {}^b g\) at the polynomial \(x+(\mu)\) to obtain \(S = bTb^{-1}\).
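The mechanism behind this proof is simply that conjugation commutes with polynomial evaluation: once the generators are conjugate, so are the images of every \(p(x)\), since \(p(bTb^{-1}) = b\,p(T)\,b^{-1}\). A minimal numpy illustration with an arbitrary sample polynomial (names are ours):

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((3, 3))
b = rng.standard_normal((3, 3))   # generic, hence invertible
binv = np.linalg.inv(b)
T = b @ S @ binv                  # T = {}^b S

def p(X):
    """An arbitrary sample polynomial, p(x) = x^3 - 2x + 5."""
    return X @ X @ X - 2 * X + 5 * np.eye(3)

# Conjugation is an algebra homomorphism, so it commutes with
# polynomial evaluation: p({}^b S) = {}^b p(S).
assert np.allclose(p(T), b @ p(S) @ binv)
```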

As a closing remark: Please don't ever teach linear algebra that way.