Abstract Algebra

\gdef\delim#1#2#3{\mathopen{}\mathclose{\left#1 #2 \right#3}} \gdef\p#1{\delim({#1})} \gdef\set#1{\delim\{{#1}\}} \gdef\floor#1{\delim\lfloor{#1}\rfloor} \gdef\ceil#1{\delim\lceil{#1}\rceil} \gdef\norm#1{\delim\vert{#1}\vert} \gdef\gen#1{\delim\langle{#1}\rangle} \gdef\S{\mathcal S} \gdef\G{\mathcal G} \gdef\vec#1{\bf{#1}}

A binary operation is a total function f:\S×\S →\S for some set \S. Abstract algebra is the study of such binary operations.

For historical reasons it is often presented as being about 'generalized numbers' called structures. This causes the main emphasis to be on the domain \S rather than the operation f. It also causes an eagerness to use infix notation for f. I find that a more neutral presentation better highlights the generality, so that is what I attempt here. This is similar to the approach taken in universal algebra.

https://en.wikipedia.org/wiki/Category:Properties_of_binary_operations https://en.wikipedia.org/wiki/Class_of_groups

https://www.math.uwaterloo.ca/~snburris/htdocs/UALG/univ-algebra.pdf

Pre-requisites

Sets, predicates, orders, sequences, etc.

Domain, codomain, image.

Partial, total.

Injective, surjective, bijective.

Transformation, permutation.

Identity, constant.

Number of total functions \mathcal A → \mathcal B: \norm{\mathcal B}^{\norm{\mathcal A}}. Number of partial functions: \p{\norm{\mathcal B} + 1}^{\norm{\mathcal A}}. Number of permutations of \mathcal A: \norm{\mathcal A}!.
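A quick brute-force enumeration over small example sets (a sketch; the sets are arbitrary, just to confirm the counts) agrees with these formulas:

```python
from itertools import product

A, B = [0, 1, 2], ["x", "y"]

total   = sum(1 for _ in product(B, repeat=len(A)))                 # |B|^|A| total functions A -> B
partial = sum(1 for _ in product(list(B) + [None], repeat=len(A)))  # (|B|+1)^|A| partial functions
perms   = sum(1 for f in product(A, repeat=len(A)) if len(set(f)) == len(A))  # |A|! permutations of A

print(total, partial, perms)   # 8 27 6
```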

Unary operation

A unary operation is a total function f:\S→\S.

A collision is a pair of distinct elements a,b ∈ \S such that f(a) = f(b).

Either f has collisions or it is injective; for finite \S an injective f is a bijection, hence invertible and a permutation.

One-way functions, hash functions.

https://en.wikipedia.org/wiki/Cycle_detection

A unary operation is equivalent to a directed graph in which every vertex has out-degree exactly one (its functional graph). On a finite \S, iterating f from any starting point traces a tail leading into a cycle.
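These cycles can be found without storing the whole sequence. Below is a minimal sketch of Brent's cycle-detection algorithm (one of the standard methods behind the link above), returning the cycle length and the tail length; the unary operation used in the demo is an arbitrary choice:

```python
def brent(f, x0):
    """Return (lam, mu): cycle length and tail length of x0, f(x0), f(f(x0)), ..."""
    # Phase 1: find the cycle length lam using increasing powers of two.
    power = lam = 1
    tortoise, hare = x0, f(x0)
    while tortoise != hare:
        if power == lam:          # start a new power of two
            tortoise = hare
            power *= 2
            lam = 0
        hare = f(hare)
        lam += 1
    # Phase 2: find the tail length mu by advancing one pointer lam steps ahead.
    tortoise = hare = x0
    for _ in range(lam):
        hare = f(hare)
    mu = 0
    while tortoise != hare:
        tortoise, hare = f(tortoise), f(hare)
        mu += 1
    return lam, mu

f = lambda x: (x * x + 1) % 1000   # a toy unary operation on Z/1000Z
print(brent(f, 3))                  # (6, 5): a cycle of length 6 entered after a tail of length 5
```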

Parallel cycle finding:

https://www.cs.csi.cuny.edu/~zhangx/papers/P_2018_LISAT_Weber_Zhang.pdf

https://people.scs.carleton.ca/~paulv/papers/JoC97.pdf

TODO: Rainbow tables.

Example applications: Pollard's rho factorization, Pollard's rho for discrete logarithms, password cracking, hash-collision finding.
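As an illustration of the first application, here is a minimal sketch of Pollard's rho factorization in its classic Floyd two-pointer form; the iteration polynomial x² + c and the starting value are conventional choices, not forced by the method:

```python
from math import gcd

def pollard_rho(n, x0=2, c=1):
    """Find a non-trivial factor of composite n using the rho of x -> x^2 + c mod n."""
    x, y, d = x0, x0, 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    return d if d != n else None     # None: retry with a different x0 or c

print(pollard_rho(8051))   # a non-trivial factor of 8051 = 83 * 97
```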

Two variables

Note. Many of these definitions generalize to functions on distinct sets, f:\mathcal A × \mathcal B → \mathcal C.

Identity element

For a binary operation f:\S×\S →\S an element e ∈ \S is a left-identity iff for all a ∈ \S

f(e,a) = a

For a binary operation f:\S×\S →\S an element e ∈ \S is a right-identity iff for all a ∈ \S

f(a,e) = a

If a binary operation f:\S×\S →\S has both a left-identity e_\mathrm{L} and right-identity e_\mathrm{R}, they must be the same

e_\mathrm{L} = f\p{e_\mathrm{L}, e_\mathrm{R}} = e_\mathrm{R}

furthermore this identity element e must be unique, for if there were a second e' that is also both a left- and right-identity then

e = f\p{e, e'} = e'

so if an element e∈\S is both a left- and right-identity, it is the unique identity element for f.

Absorbing element

For a binary operation f:\S×\S →\S an element z ∈ \S is left-absorbing if for all a ∈ \S

f(z,a) = z

For a binary operation f:\S×\S →\S an element z ∈ \S is right-absorbing if for all a ∈ \S

f(a,z) = z

If a binary operation f:\S×\S →\S has both a left-absorbing element z_\mathrm{L} and right-absorbing element z_\mathrm{R}, they must be the same

z_\mathrm{L} = f\p{z_\mathrm{L}, z_\mathrm{R}} = z_\mathrm{R}

furthermore this absorbing element z must be unique, for if there were a second z' that is also both left- and right-absorbing then

z = f\p{z, z'} = z'

so if an element z∈\S is both left- and right-absorbing, it is the unique absorbing element for f.

Examples. Zero is the identity element for addition and the absorbing element for multiplication. Negative and positive infinity are the identity and absorbing elements for \operatorname{max}, and reversed for \operatorname{min}.
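For a finite \S both kinds of special element can be found by brute force. A small sketch (the helper and the \operatorname{max}/\operatorname{min} examples are mine) confirming the examples above on \set{0,…,4}:

```python
def special_elements(S, f):
    """Return (identities, absorbers) of the binary operation f on the finite set S."""
    identities = [e for e in S if all(f(e, a) == a and f(a, e) == a for a in S)]
    absorbers  = [z for z in S if all(f(z, a) == z and f(a, z) == z for a in S)]
    return identities, absorbers

S = range(5)
print(special_elements(S, max))   # ([0], [4]): 0 is the identity, 4 absorbs
print(special_elements(S, min))   # ([4], [0]): the roles swap for min
```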

Cancelability

https://en.wikipedia.org/wiki/Cancellation_property

https://en.wikipedia.org/wiki/Cancellative_semigroup

Inverses

Given a binary operation f:\S×\S →\S with an identity element e∈\S, an element a ∈ \S has a left-inverse b ∈ \S if

f(b,a) = e

Given a binary operation f:\S×\S →\S with an identity element e∈\S, an element a ∈ \S has a right-inverse b ∈ \S if

f(a,b) = e

If f is associative the f-inverse of an element is unique, if it exists: if b were a left-inverse and b' a right-inverse of a ∈ \S, we would have

b = f(b,e) = f(b,f(a, b')) = f(f(b,a), b') = f(e, b') = b'

Given a binary operation f:\S×\S →\S with an identity element e∈\S, we say \S is f-invertible if an f-inverse exists for every element of \S.

This defines a new unary operator g:\S→\S mapping each element to its unique f-inverse. This inverse operator satisfies

f(a, g(a)) = f(g(a), a) = e
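For a finite \S the inverse operator g can likewise be tabulated by brute force. A sketch using addition modulo 6 as the example operation:

```python
def inverse_map(S, f, e):
    """Map each element of S to its two-sided f-inverse, or None if it has none."""
    return {a: next((b for b in S if f(a, b) == e and f(b, a) == e), None) for a in S}

n = 6
S = range(n)
add = lambda a, b: (a + b) % n
print(inverse_map(S, add, 0))   # {0: 0, 1: 5, 2: 4, 3: 3, 4: 2, 5: 1}
```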

Commutativity

A binary operation f:\S×\S →\S is commutative if for all a,b∈\S

f(a,b) = f(b,a)

A binary operation f:\S×\S →\S with inverse g:\S→\S is anti-commutative if for all a,b∈\S

f(a,b) = g(f(b,a))

For a binary operation f:\S×\S →\S an element a ∈ \S is central (an element of the center) if for all b ∈ \S

f(a,b)=f(b,a)

Associativity

A binary operation f:\S×\S →\S is associative if for all a,b,c∈\S

f(a, f(b,c)) = f(f(a,b), c)

Given a binary operation f:\S×\S →\S and an element a ∈ \S we can define the sequence

\begin{aligned} a_1 &= a &\quad a_{n+1} = f\p{a, a_n} \end{aligned}

An example of an operator that is commutative but not associative is the average f(a,b) = \frac{a + b}{2}.
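A quick exact check of this example (a sketch using rationals; a spot check, not a proof):

```python
from fractions import Fraction

avg = lambda a, b: (a + b) / 2   # exact on Fractions

a, b, c = Fraction(1), Fraction(2), Fraction(4)
print(avg(a, b) == avg(b, a))                    # True:  commutative
print(avg(a, avg(b, c)) == avg(avg(a, b), c))    # False: 2 vs 11/4, so not associative
```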

Question: Is there an example of commutativity without power-associativity?

A binary operation f:\S×\S →\S is flexible if for all a,b∈\S

f(a, f(b, a)) = f(f(a, b), a)

A binary operation f:\S×\S →\S is alternative if for all a,b∈\S

\begin{aligned} f(a, f(a, b)) &= f(f(a, a), b) & f(b, f(a, a)) &= f(f(b, a), a) & \end{aligned}

Artin's theorem: if f is alternative, then the sub-magma generated by any two elements is associative.

Idempotence

For a binary operation f:\S×\S →\S an element a ∈ \S is idempotent if

f(a,a)=a

If all elements of \S are idempotent we say that f is idempotent.

https://en.wikipedia.org/wiki/Idempotence

If f is commutative, associative and idempotent it is called a meet. We can define a partial order on \S by a ≤ b iff f(a, b) = a.

https://en.wikipedia.org/wiki/Join_and_meet#Universal_algebra_approach
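As a concrete instance, \gcd on the positive integers is commutative, associative and idempotent, and the induced order a ≤ b iff \gcd(a, b) = a is divisibility. A small sketch:

```python
from math import gcd

def leq(a, b, f=gcd):
    """The partial order induced by a meet operation: a <= b iff f(a, b) == a."""
    return f(a, b) == a

print(leq(3, 12))   # True:  3 divides 12
print(leq(3, 10))   # False: 3 does not divide 10
print(leq(4, 4))    # True:  reflexive, by idempotence
```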

Power associativity

f is power-associative if for all a ∈ \S and n,m ∈ \N^+

a_{n+m} = f\p{a_n, a_m}

If we reduce a sequence of n copies of a by applying f to pairs of elements we will always get a_n regardless of what order the f's are applied in. If f is written infix as * we can unambiguously write

a_n = \underbrace{a*a*⋯*a}_{ \begin{array}{rl} \small n &\!\! \text{copies of } a \\ \small \p{n-1}&\!\!\text{times } * \end{array}}

This sequence defines exponentiation \S×\N^+→\S. If f has an identity element e, we set a_0 = e. If a has an f-inverse b with its own exponentiation sequence b_n, then we set a_{-n} = b_n. This extends the domain to \N and \Z respectively. Note that all of these extensions are compatible with each other and with the relations above.

Power-associativity implies that f is commutative and associative on the set \set{a_i}: f(a_i, a_j) = a_{i+j} = f(a_j, a_i) and f(a_i, f(a_j, a_k)) = a_{i+j+k} = f(f(a_i, a_j), a_k). f need not be associative on the whole of \S.

Question: Is there an example of a commutative operator that is not power-associative?

From the definition we can compute a_n using n-1 applications of f, but the power associativity allows us to do better. Consider the following two methods to compute a_{15}, requiring 6 and 5 applications of f respectively instead of 14:

\begin{aligned} a_1 &= a &\quad\quad a_1 &= a \\ a_2 &= f(a_1, a_1) & a_2 &= f(a_1, a_1) \\ a_4 &= f(a_2, a_2) & a_3 &= f(a_1, a_2) \\ a_8 &= f(a_4, a_4) & a_6 &= f(a_3, a_3) \\ a_3 &= f(a_1, a_2) & a_{12} &= f(a_6, a_6) \\ a_7 &= f(a_3, a_4) & a_{15} &= f(a_3, a_{12}) \\ a_{15} &= f(a_7, a_8) & \\ \end{aligned}

On the left we compute all powers of two up to a_{2^{\floor{\log_2 n}}} using \floor{\log_2 n} squarings, and then combine them using \operatorname{pop}\p{n} - 1 further applications of f, where \operatorname{pop} is the binary Hamming weight (the number of non-zero digits in the binary expansion of n). This method is called exponentiation by squaring; it works for all n and requires exactly \floor{\log_2 n} + \operatorname{pop}\p{n} - 1 applications of f. On the right we show a method specific to 15 that performs better. Such methods rely on addition chains and are called addition-chain exponentiation. Finding optimal addition chains is an ongoing research topic Fla23. If a has an easy-to-compute inverse, or if multiple exponentiations need to be computed in parallel, or if there is other structure, see Gor97. In general computing a_n requires at least \ceil{\log_2 n} applications of f and can always be done with at most 2\floor{\log_2 n}, so the cost is logarithmic in n.
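A minimal sketch of exponentiation by squaring for an arbitrary power-associative f, following the left-hand scheme (scan the binary expansion of n, squaring as we go); the example operations are mine:

```python
def power(f, a, n):
    """Compute the n-th power a_n (n >= 1) of a under a power-associative f by squaring."""
    assert n >= 1
    result = None          # accumulates the product of the a_{2^i} for the bits seen so far
    square = a             # a_{2^i} for the current bit position i
    while n:
        if n & 1:
            result = square if result is None else f(result, square)
        n >>= 1
        if n:
            square = f(square, square)
    return result

add = lambda x, y: x + y
print(power(add, 3, 15))                          # 45: additive notation, 15 * 3
print(power(lambda x, y: x * y % 1000, 7, 15))    # 943: 7^15 mod 1000
```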

If \S is finite, the sequence a_i will at some point have to repeat. There will be a tail and a cycle. Does power-associativity prove anything about the tail and cycle lengths?

\begin{aligned} a_{T + L} &= a_{T} & f(a_T, a_L) &= f(a_L, a_T) = a_T \\ a_{T + L + i} &= a_{T + i} & f(a_T, a_{L + i}) &= f(a_T, a_i) \\ \end{aligned}

This means a_L acts as an identity to a_T.

Without loss of generality we have k ≥ 1 and l ≥ 1, the smallest such numbers, with a_{k} = a_{k+l}. If k > 1 we can write k = k' + 1

a_k = a_{k+l} = f(a_k, a_l)

The first 'inverse' operation is the discrete logarithm \S×\S → \Z, where n = \log_a b is a number (if it exists) such that a_n = b. This is a much harder problem than exponentiation and subject to active competition on specific instances. The best methods make additional assumptions on f and \S. For the generic case there does not appear to be a method better than a brute-force enumeration of the a_n.
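A sketch of the generic brute-force enumeration, with modular multiplication standing in for f:

```python
def discrete_log(f, a, b, max_n=10**6):
    """Smallest n >= 1 with a_n == b, enumerating a_1, a_2, ... with a_{n+1} = f(a, a_n); None if not found."""
    x = a
    for n in range(1, max_n + 1):
        if x == b:
            return n
        x = f(a, x)
    return None

p = 101
mul = lambda x, y: x * y % p
print(discrete_log(mul, 2, 10))   # 25, since 2^25 = 10 (mod 101)
```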

The second 'inverse' operation is the n-th root \S×\Z → \S, where a = \sqrt[n]{b} is an element (if it exists) such that a_n = b.

Another derived operation is the order \S → \Z, or period, of an element, where k = \operatorname{ord}\p{a} is the smallest k such that a_{n+k} = a_n for all sufficiently large n.
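A sketch that finds the tail length and period of the sequence a_n by direct iteration, assuming \S is small enough to memoise the elements seen; the modular example is mine:

```python
def tail_and_period(f, a):
    """Return (tail, period) of the sequence a_1 = a, a_{n+1} = f(a, a_n) over a finite set."""
    seen = {}            # element -> first index n at which it appeared
    x, n = a, 1
    while x not in seen:
        seen[x] = n
        x = f(a, x)
        n += 1
    return seen[x] - 1, n - seen[x]   # tail length, cycle length (the period)

mul = lambda x, y: x * y % 20
print(tail_and_period(mul, 2))   # (1, 4): 2, 4, 8, 16, 12, 4, 8, ... has tail 1 and period 4
```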

Q: What about f guarantees that f(-,a) is cyclic? Ans: If a has an inverse -a (and f is associative), then f(-,a) has an inverse f(-,-a) and thus both are permutations.

Q: What about f guarantees that \operatorname{ord}\p{a} divides \norm{\S}?

If k is the least common multiple of the orders of all elements, then for each a we have a_{1 + k} = a. We also have a_{1 + k} = f(a_k, a), hence

f(a_k, a) = f(a, a_k) = a

If a_k is the same regardless of a, then it must be the identity element. But does the converse hold? If f has an identity element, does it need to appear as a_k?

TODO: Lagrange's theorem, Cauchy's theorem.

https://en.wikipedia.org/wiki/Order_(group_theory) https://en.wikipedia.org/wiki/Lagrange's_theorem_(group_theory) https://en.wikipedia.org/wiki/Cauchy%27s_theorem_(group_theory) ( a special case of Sylow ) https://en.wikipedia.org/wiki/Sylow_theorems

Question: If \S is finite the sequence a_i will eventually have to be cyclic. What can we say about the cycle length?

https://vixra.org/pdf/1509.0011v1.pdf (Generalization of Lagrange's Theorem to certain semigroups)

https://en.wikipedia.org/wiki/Cycle_detection

Notation

In additive notation the binary operation is written as infix +, the identity element (if it exists) as 0 and the inverse operator as prefix - (i.e. the additive inverse of a is -a). Exponentiation is written as n⋅a.

In multiplicative notation the binary operation is written as infix ⋅ (or by juxtaposition), the identity element (if it exists) as 1, the absorbing element (if it exists) as 0, and the inverse operator as suffix {}^{-1} (i.e. the multiplicative inverse of a is a^{-1}). Exponentiation is written as a^n.

Structures

https://en.wikipedia.org/wiki/Magma_(algebra)#Classification_by_properties

https://en.wikipedia.org/wiki/Special_classes_of_semigroups

Depending on the properties of f the structure \p{\S, f} goes by various names

| | associative | identity | invertible | commutative | idempotent |
|---|:---:|:---:|:---:|:---:|:---:|
| magma | | | | | |
| semigroup | ✓ | | | | |
| monoid | ✓ | ✓ | | | |
| group | ✓ | ✓ | ✓ | | |
| abelian group | ✓ | ✓ | ✓ | ✓ | |
| unital magma | | ✓ | | | |
| quasigroup | | | ✓ | | |
| associative quasigroup | ✓ | | ✓ | | |
| loop | | ✓ | ✓ | | |
| band | ✓ | | | | ✓ |
| semi-lattice | ✓ | | | ✓ | ✓ |
| bounded semi-lattice | ✓ | ✓ | | ✓ | ✓ |
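For a finite operation table these properties can be checked mechanically, which then places \p{\S, f} in the table above. A rough sketch (helper names are mine; only the listed properties are tested):

```python
from itertools import product

def properties(S, f):
    """Report which of the table's properties hold for the binary operation f on finite S."""
    S = list(S)
    assoc = all(f(a, f(b, c)) == f(f(a, b), c) for a, b, c in product(S, repeat=3))
    comm  = all(f(a, b) == f(b, a) for a, b in product(S, repeat=2))
    idem  = all(f(a, a) == a for a in S)
    ids   = [e for e in S if all(f(e, a) == a == f(a, e) for a in S)]
    inv   = bool(ids) and all(any(f(a, b) == ids[0] == f(b, a) for b in S) for a in S)
    return dict(associative=assoc, identity=bool(ids), invertible=inv,
                commutative=comm, idempotent=idem)

print(properties(range(5), lambda a, b: (a + b) % 5))  # all but idempotent: an abelian group
print(properties(range(5), max))                       # associative, identity, commutative, idempotent: a bounded semi-lattice
```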

Classification theorems

For many of these structures there are classification theorems enumerating all possible structures up to isomorphism.

Summary

Derived operators

Algorithms

https://en.wikipedia.org/wiki/Computational_group_theory

Two operations

Distributive

Given two binary operations f,g:\S × \S → \S, we say g is left-distributive over f iff for all a, b,c ∈ \S

g(a, f(b, c)) = f(g(a, b), g(a, c))

Similarly, we say g is right-distributive over f iff for all a, b,c ∈ \S

g(f(a, b), c) = f(g(a, c), g(b, c))

If g is both left- and right-distributive over f we say g is distributive over f. If g is commutative, left- and right-distributivity are equivalent.
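A brute-force spot check over a small range of integers (a sketch, not a proof) illustrating which familiar pairs of operations distribute; since the operations tested are commutative, checking left-distributivity suffices:

```python
from itertools import product

def left_distributes(g, f, S):
    """Check g(a, f(b, c)) == f(g(a, b), g(a, c)) for all a, b, c in S."""
    return all(g(a, f(b, c)) == f(g(a, b), g(a, c)) for a, b, c in product(S, repeat=3))

S = range(-5, 6)
mul, add = lambda a, b: a * b, lambda a, b: a + b

print(left_distributes(mul, add, S))   # True:  a*(b+c) == a*b + a*c
print(left_distributes(add, mul, S))   # False: a+(b*c) != (a+b)*(a+c) in general
print(left_distributes(add, min, S))   # True:  a + min(b,c) == min(a+b, a+c)
```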

Absorption

Given two binary operations f,g:\S × \S → \S, we say f and g satisfy the absorption law iff for all a, b ∈ \S

f(a, g(a, b)) = g(a, f(a, b)) = a

It follows that both f and g are idempotent: a = f(a, g(a, f(a, a))) = f(a, a), where the first equality is the absorption law for f and the second uses g(a, f(a, a)) = a; the same argument works with f and g reversed.

If f and g are also commutative and associative, the resulting structure with two operations is a lattice.
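For example, \gcd and \operatorname{lcm} on the positive integers satisfy absorption and give the divisibility lattice. A sampled check (a sketch, not a proof):

```python
from math import gcd
from itertools import product

lcm = lambda a, b: a * b // gcd(a, b)

S = range(1, 30)
absorption = all(
    gcd(a, lcm(a, b)) == a and lcm(a, gcd(a, b)) == a for a, b in product(S, repeat=2)
)
print(absorption)   # True: gcd and lcm absorb each other on the positive integers
```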

https://en.wikipedia.org/wiki/Lattice_(order)

https://en.wikipedia.org/wiki/Boolean_algebra_(structure)

Semi-rings

A neat example of a semi-ring is ℝ∪\set{∞} with \min and +. This is the tropical semiring.

https://en.wikipedia.org/wiki/Tropical_semiring https://arxiv.org/abs/2309.11256
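In the tropical semiring the usual matrix product, with \p{\min, +} in place of \p{+, ⋅}, computes shortest paths: repeated tropical squaring of an edge-weight matrix yields all-pairs shortest-path distances. A sketch (the small graph is my own example):

```python
INF = float("inf")

def tropical_matmul(A, B):
    """Matrix product over the (min, +) semiring: C[i][j] = min_k (A[i][k] + B[k][j])."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# Edge-weight matrix of a small directed graph; INF marks "no edge", 0 on the diagonal.
D = [
    [0,   3,   INF, 7],
    [8,   0,   2,   INF],
    [5,   INF, 0,   1],
    [2,   INF, INF, 0],
]

P = D
for _ in range(2):             # tropical squaring twice covers paths of up to 4 edges
    P = tropical_matmul(P, P)
print(P)                        # P[i][j] is now the shortest-path distance from i to j
```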

Rings

A ring is a structure \p{\mathcal R, +, ⋅} such that \p{\mathcal R, +} is an abelian group (written additively) and \p{\mathcal R, ⋅} is a monoid (written multiplicatively) and +, ⋅ satisfy the distributive property: for all a,b,c∈\mathcal R

\begin{aligned} a⋅\p{b+c} &= a⋅b + a⋅c \\ \p{a+b}⋅c &= a⋅c + b⋅c \\ \end{aligned}

A commutative ring is a ring \p{\mathcal R, +, ⋅} where ⋅ is commutative.

A field is a commutative ring \p{\mathcal R, +, ⋅} where \p{\mathcal R \setminus \{0\}, ⋅} is an abelian group.

rngs ⊃ rings ⊃ commutative rings ⊃ integral domains ⊃ integrally closed domains ⊃ GCD domains ⊃ unique factorization domains ⊃ principal ideal domains ⊃ Euclidean domains ⊃ fields ⊃ algebraically closed fields
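A small computational check (a sketch) that \Z/n\Z is a field exactly when n is prime, by testing whether every non-zero element has a multiplicative inverse:

```python
def is_field(n):
    """True iff every non-zero element of Z/nZ has a multiplicative inverse."""
    return all(any(a * b % n == 1 for b in range(1, n)) for a in range(1, n))

print([n for n in range(2, 20) if is_field(n)])   # [2, 3, 5, 7, 11, 13, 17, 19]: the primes
```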

Scalar multiplication

An R-module \p{\mathcal V, +, ⋅} is an abelian group \p{\mathcal V, +} with a scalar multiplication operation ⋅:R×\mathcal V → \mathcal V where R is a ring such that

\begin{aligned} \p{a ⋅ b} ⋅ \vec v &= a ⋅ \p{b ⋅ \vec v} \\ 1⋅\vec v & = \vec v \\ a⋅\p{\vec u + \vec v} &= a⋅\vec u + a⋅\vec v \\ \p{a + b}⋅\vec v &= a ⋅ \vec v + b ⋅ \vec v \end{aligned}

This is also called a module over R.

The exponentiation in the abelian group, \mathcal V × ℤ → \mathcal V, is a scalar multiplication, making any abelian group a ℤ-module. Conversely, it can be shown that any ℤ-module is an abelian group.

A basis for an R-module M is a set of linearly independent vectors {\vec v}_i such that \operatorname{span}_R\p{{\vec v}_0, {\vec v}_1, …} = M.

A free module is a module that has a basis.

A vector space is a module over a field.

A matrix space

Every abelian group is a \Z-module via its exponentiation. An abelian group of exponent n (meaning n⋅a = 0 for all a) is a \Z/n\Z-module, and if n is prime it is a vector space over the prime field \mathbb{F}_n. Note that this applies to elliptic curve groups. (Q: What if n is p^k?)

Composition

https://en.wikipedia.org/wiki/Composition_ring

https://en.wikipedia.org/wiki/Polynomial_decomposition

Derivatives

https://en.wikipedia.org/wiki/Formal_derivative

https://en.wikipedia.org/wiki/Derivation_(differential_algebra)

Morphisms

https://en.wikipedia.org/wiki/Category_(mathematics)#Types_of_morphisms

A function between two structures of the same type f:G→H is called a homomorphism if it preserves the structure: it maps any constants to the corresponding constants, i.e. f(0_G)=0_H, f(1_G)=1_H, etc.; it preserves any unary operators, i.e. f(g_G(x)) = g_H(f(x)); and it preserves binary operators, f(g_G(a, b)) = g_H(f(a), f(b)), and so on for higher arities.
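A sketch of checking this property for the map x ↦ 2^x from \p{\Z/4\Z, +} to the multiplicative monoid of \Z/15\Z (a hand-picked example; its image is the subgroup \set{1, 2, 4, 8}):

```python
from itertools import product

def is_homomorphism(f, G, op_G, op_H):
    """Check f(op_G(a, b)) == op_H(f(a), f(b)) for all a, b in G."""
    return all(f(op_G(a, b)) == op_H(f(a), f(b)) for a, b in product(G, repeat=2))

G = range(4)
op_G = lambda a, b: (a + b) % 4        # addition in Z/4Z
op_H = lambda a, b: a * b % 15         # multiplication in Z/15Z
f = lambda x: pow(2, x, 15)            # x -> 2^x, works because 2^4 = 1 (mod 15)

print(is_homomorphism(f, G, op_G, op_H))   # True: exponentiation turns + into *
```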

A homomorphism f:G→H with f injective is a monomorphism.

A homomorphism f:G→H with f surjective is an epimorphism.

A homomorphism f:G→H with G = H is an endomorphism.

A homomorphism f:G→H is an isomorphism if there is an inverse function g:H→G that is also a homomorphism. If such an inverse exists, G and H are isomorphic, denoted G ≅ H.

A homomorphism f:G→H that is both an endomorphism and an isomorphism is an automorphism.

Classification theorems

Up to isomorphism many algebraic structures are enumerable or otherwise constrained in their possible instances.

https://en.wikipedia.org/wiki/Classification_of_finite_simple_groups

https://en.wikipedia.org/wiki/Wedderburn%E2%80%93Artin_theorem

https://en.wikipedia.org/wiki/Structure_theorem_for_finitely_generated_modules_over_a_principal_ideal_domain

All finite fields have order p^n for some prime p and positive integer n. All finite fields of the same order are isomorphic.


https://en.wikipedia.org/wiki/Polynomial_ring

https://en.wikipedia.org/wiki/Isomorphism_theorems#Universal_algebra
