Elements $g_1, \dots, g_k$ in a group $G$ invariably generate $G$ if they still generate after an adversary replaces them by conjugates. This is a function of conjugacy classes: we could say that conjugacy classes $C_1, \dots, C_k$ in a group $G$ invariably generate $G$ if $\langle x_1, \dots, x_k \rangle = G$ whenever $x_i \in C_i$ for each $i$. This concept was invented by Dixon to quantify the expected running time of the most standard algorithm for computing the Galois group $G$ of a polynomial $f \in \mathbf{Z}[x]$: reduce $f$ modulo various primes $p$, get your Frobenius element $\mathrm{Frob}_p$, and then try to infer what your Galois group is from the information that it contains $\mathrm{Frob}_p$ (which is defined only up to conjugacy) for each $p$. If $G$ is secretly $S_n$, and you somehow know a priori that $G \leq S_n$, then the number of primes you need on average to prove that $G = S_n$ is the expected number of elements it takes to invariably generate $S_n$.
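Concretely, the Frobenius step can be sketched in a few lines: when $f$ is squarefree mod $p$, the cycle type of $\mathrm{Frob}_p$ acting on the roots of $f$ is the multiset of degrees of the irreducible factors of $f$ mod $p$, which distinct-degree factorization reads off without fully factoring. A self-contained sketch (the example polynomial $x^4 + 1$ and the primes are my own choices, not from the post):

```python
# Cycle type of Frob_p via distinct-degree factorization over F_p.
# Polynomials are coefficient lists, lowest degree first; f should be
# monic and squarefree mod p.

def _trim(a):
    while a and a[-1] == 0:
        a.pop()
    return a

def _rem(a, f, p):
    """Remainder of a modulo f, over F_p."""
    a = [c % p for c in a]
    inv = pow(f[-1], -1, p)
    while len(a) >= len(f):
        c, shift = a[-1] * inv % p, len(a) - len(f)
        for i, fi in enumerate(f):
            a[shift + i] = (a[shift + i] - c * fi) % p
        _trim(a)
    return a

def _mulmod(a, b, f, p):
    """(a * b) mod f over F_p."""
    res = [0] * max(len(a) + len(b) - 1, 0)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return _rem(res, f, p)

def _powmod(a, e, f, p):
    """a**e mod f over F_p, by square-and-multiply."""
    result, base = [1], _rem(a, f, p)
    while e:
        if e & 1:
            result = _mulmod(result, base, f, p)
        base = _mulmod(base, base, f, p)
        e >>= 1
    return result

def _gcd(a, b, p):
    while b:
        a, b = b, _rem(a, b, p)
    if a:  # normalise to a monic polynomial
        inv = pow(a[-1], -1, p)
        a = [c * inv % p for c in a]
    return a

def _exact_div(a, b, p):
    """Quotient a / b over F_p, assuming b divides a exactly."""
    a, q = [c % p for c in a], [0] * (len(a) - len(b) + 1)
    inv = pow(b[-1], -1, p)
    while len(a) >= len(b):
        c, shift = a[-1] * inv % p, len(a) - len(b)
        q[shift] = c
        for i, bi in enumerate(b):
            a[shift + i] = (a[shift + i] - c * bi) % p
        _trim(a)
    return q

def frobenius_cycle_type(f, p):
    """Degrees of the irreducible factors of f mod p, i.e. the cycle
    type of Frob_p on the roots of f (f monic, squarefree mod p)."""
    g = _trim([c % p for c in f])
    h = [0, 1]                # the polynomial x
    degrees, d = [], 0
    while len(g) > 1:         # while deg g >= 1
        d += 1
        h = _powmod(h, p, g, p)            # now h = x**(p**d) mod g
        diff = h + [0] * (2 - len(h))
        diff[1] = (diff[1] - 1) % p        # h - x
        fac = _gcd(_trim(diff), g, p)      # product of degree-d factors
        if len(fac) > 1:
            degrees += [d] * ((len(fac) - 1) // d)
            g = _exact_div(g, fac, p)
    return sorted(degrees)

for p in [3, 5, 7, 17]:
    print(p, frobenius_cycle_type([1, 0, 0, 0, 1], p))  # f = x^4 + 1
```

Since $x^4 + 1$ has Galois group $C_2 \times C_2$, every cycle type that appears consists of cycles of length at most $2$ (here $[2,2]$ for $p = 3, 5, 7$ and $[1,1,1,1]$ for $p = 17$), and observing only such cycle types is exactly the evidence from which one infers the group.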
For example, if $G = S_n$, then we know that four random elements invariably generate $S_n$ with positive probability, while three random elements almost surely (as $n \to \infty$) do not invariably generate. Therefore if $G = S_n$ then it typically takes four primes to prove it.
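Since the property depends only on conjugacy classes, it can be checked by brute force in a tiny case by trying every choice of conjugates directly. A toy sketch of the definition (the group $S_4$ and the cycle types below are my own choices for illustration):

```python
from itertools import permutations, product

# Permutations of {0, ..., n-1} as tuples: p[i] is the image of i.

def compose(p, q):
    """The product p * q, acting as p after q."""
    return tuple(p[q[i]] for i in range(len(p)))

def inverse(p):
    q = [0] * len(p)
    for i, pi in enumerate(p):
        q[pi] = i
    return tuple(q)

def sym(n):
    """All elements of the symmetric group S_n."""
    return [tuple(p) for p in permutations(range(n))]

def generated(gens, n):
    """The subgroup generated by gens, by closure from the identity."""
    identity = tuple(range(n))
    group, frontier = {identity}, [identity]
    while frontier:
        g = frontier.pop()
        for s in gens:
            h = compose(s, g)
            if h not in group:
                group.add(h)
                frontier.append(h)
    return group

def conjugacy_class(g, G):
    return {compose(compose(x, g), inverse(x)) for x in G}

def invariably_generate(elements, n):
    """True iff every choice of conjugates of the given elements
    generates all of S_n."""
    G = sym(n)
    classes = [conjugacy_class(g, G) for g in elements]
    return all(len(generated(combo, n)) == len(G)
               for combo in product(*classes))

four_cycle = (1, 2, 3, 0)     # a 4-cycle
transposition = (1, 0, 2, 3)  # a transposition
three_cycle = (1, 2, 0, 3)    # a 3-cycle
print(invariably_generate([four_cycle, transposition], 4))  # False
print(invariably_generate([four_cycle, three_cycle], 4))    # True
```

A 4-cycle and a suitable transposition generate $S_4$, but they never invariably generate it: an adversary can conjugate both into a common dihedral subgroup of order 8. A 4-cycle and a 3-cycle, by contrast, do invariably generate $S_4$.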
A few days ago Eilidh McKemmie posted a paper on the arXiv which extends this result to finite classical groups: e.g., if $G$ is $\mathrm{SL}_n(q)$ then, for large enough constant $q$ and $n \to \infty$, four random elements invariably generate with positive probability, but three do not. (The bounded-rank case is rather different in character, and I think two elements suffice.) The proof is pretty cool: invariable generation in $G$ is related to invariable generation in the Weyl group, which is either $S_n$ or $C_2 \wr S_n$, and we already understand invariable generation for these groups (using a small trick for the latter).
I believe the restriction to large enough constant $q$ is a technical rather than essential problem. Assuming it can be overcome, we will be able to deduce the following rather clean statement: If $G$ is a finite simple group then four random elements invariably generate $G$ with probability bounded away from zero. Moreover, if the rank of $G$ is unbounded then three random elements do not.