algebra group theory dummit foote, Summaries of Algebra
Uploaded on 11/03/2022 by berke-nar
CHAPTER I
Groups
§1. MONOIDS
Let S be a set. A mapping

$$S \times S \to S$$

is sometimes called a law of composition (of S into itself). If x, y are elements of
S, the image of the pair (x, y) under this mapping is also called their product
under the law of composition, and will be denoted by xy. (Sometimes, we also
write x · y, and in many cases it is also convenient to use an additive notation,
and thus to write x + y. In that case, we call this element the sum of x and y.
It is customary to use the notation x + y only when the relation x + y =
y + x holds.)
Let S be a set with a law of composition. If x, y, z are elements of S, then we
may form their product in two ways: (xy)z and x(yz). If (xy)z = x(yz) for all
x, y, z in S then we say that the law of composition is associative.
An element e of S such that ex = x = xe for all x ∈ S is called a unit
element. (When the law of composition is written additively, the unit element
is denoted by 0, and is called a zero element.) A unit element is unique, for if
e' is another unit element, we have
e = ee' = e'
by assumption. In most cases, the unit element is written simply 1 (instead of e).
For most of this chapter, however, we shall write e so as to avoid confusion in
proving the most basic properties.
A monoid is a set G, with a law of composition which is associative, and
having a unit element (so that in particular, G is not empty).
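A monoid's axioms can be spot-checked computationally. The sketch below is not from the text (the helper name `is_monoid_on_sample` is ours); it tests associativity and the unit law on a finite sample, using strings under concatenation, whose unit element is the empty string:

```python
from itertools import product

def is_monoid_on_sample(elements, op, e):
    """Check associativity and the unit law on a finite sample of elements."""
    assoc = all(op(op(x, y), z) == op(x, op(y, z))
                for x, y, z in product(elements, repeat=3))
    unit = all(op(e, x) == x == op(x, e) for x in elements)
    return assoc and unit

# Strings under concatenation form a monoid with unit "" (the empty string).
sample = ["", "a", "ab", "ba"]
print(is_monoid_on_sample(sample, lambda x, y: x + y, ""))  # True
```

Subtraction on integers, by contrast, fails both laws, so the same check returns False for it.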




Let G be a monoid, and x_1, ..., x_n elements of G (where n is an integer ≥ 1). We define their product inductively:

$$\prod_{\nu=1}^{n} x_\nu = x_1 \cdots x_n = (x_1 \cdots x_{n-1})\,x_n.$$

We then have the following rule:

$$\prod_{\mu=1}^{m} x_\mu \cdot \prod_{\nu=1}^{n} x_{m+\nu} = \prod_{\nu=1}^{m+n} x_\nu,$$

which essentially asserts that we can insert parentheses in any manner in our product without changing its value. The proof is easy by induction, and we shall leave it as an exercise. One also writes

$$\prod_{\nu=m+1}^{m+n} x_\nu \quad\text{instead of}\quad \prod_{\nu=1}^{n} x_{m+\nu},$$

and we define

$$\prod_{\nu=1}^{0} x_\nu = e.$$

As a matter of convention, we agree also that the empty product is equal to the unit element.

It would be possible to define more general laws of composition, i.e. maps

$$S_1 \times S_2 \to S_3$$

using arbitrary sets. One can then express associativity and commutativity in any setting for which they make sense. For instance, for commutativity we need a law of composition

$$f : S \times S \to T$$

where the two sets of departure are the same. Commutativity then means f(x, y) = f(y, x), or xy = yx if we omit the mapping f from the notation. For associativity, we leave it to the reader to formulate the most general combination of sets under which it will work. We shall meet special cases later, for instance arising from maps

$$S \times S \to S \quad\text{and}\quad S \times T \to T.$$

Then a product (xy)z makes sense with x ∈ S, y ∈ S, and z ∈ T. The product x(yz) also makes sense for such elements x, y, z and thus it makes sense to say that our law of composition is associative, namely to say that for all x, y, z as above we have (xy)z = x(yz).

If the law of composition of G is commutative, we also say that G is commutative (or abelian).
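The generalized associativity rule stated earlier (parentheses may be inserted in any manner without changing the product) can be illustrated numerically; a hedged sketch, not from the text, using `functools.reduce` for the inductive left-to-right product and a randomly parenthesized product for comparison:

```python
import random
from functools import reduce

def left_product(xs, op):
    """x1 x2 ... xn computed as ((x1 x2) x3) ... xn, matching the inductive definition."""
    return reduce(op, xs)

def random_paren_product(xs, op):
    """Multiply xs with parentheses inserted at a random split point, recursively."""
    if len(xs) == 1:
        return xs[0]
    k = random.randrange(1, len(xs))
    return op(random_paren_product(xs[:k], op), random_paren_product(xs[k:], op))

concat = lambda a, b: a + b          # an associative law of composition
words = ["al", "ge", "br", "a"]
print(left_product(words, concat))   # 'algebra'
assert all(random_paren_product(words, concat) == left_product(words, concat)
           for _ in range(100))
```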


f : I × J → G a mapping into a commutative monoid which takes the value e for almost all pairs (i, j). Then

$$\prod_{i \in I} \Bigl[ \prod_{j \in J} f(i, j) \Bigr] = \prod_{j \in J} \Bigl[ \prod_{i \in I} f(i, j) \Bigr].$$

We leave the proof as an exercise.

As a matter of notation, we sometimes write $\prod f(i)$, omitting the signs $i \in I$, if the reference to the indexing set is clear.

Let x be an element of a monoid G. For every integer n ≥ 0 we define x^n to be

$$x^n = \prod_{\nu=1}^{n} x,$$

so that in particular we have x^0 = e, x^1 = x, x^2 = xx, .... We obviously have x^{n+m} = x^n x^m and (x^n)^m = x^{nm}. Furthermore, from our preceding rules of associativity and commutativity, if x, y are elements of G such that xy = yx, then (xy)^n = x^n y^n. We leave the formal proof as an exercise.
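The role of commutativity in $(xy)^n = x^n y^n$ can be seen in a small experiment (ours, not the book's): the identity fails for two non-commuting 2 × 2 matrices, but holds for commuting elements such as integers modulo 7 under multiplication:

```python
def mat_mul(A, B):
    """2x2 integer matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def power(x, n, op, e):
    """x^n by repeated multiplication, with x^0 = e."""
    result = e
    for _ in range(n):
        result = op(result, x)
    return result

I = [[1, 0], [0, 1]]
X = [[1, 1], [0, 1]]
Y = [[1, 0], [1, 1]]

# X and Y do not commute, and (XY)^2 != X^2 Y^2:
XY = mat_mul(X, Y)
print(power(XY, 2, mat_mul, I) == mat_mul(power(X, 2, mat_mul, I),
                                          power(Y, 2, mat_mul, I)))  # False

# Commuting elements (integers mod 7 under multiplication) do satisfy it:
mulmod = lambda a, b: (a * b) % 7
print(power(3 * 5 % 7, 4, mulmod, 1)
      == power(3, 4, mulmod, 1) * power(5, 4, mulmod, 1) % 7)        # True
```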

If S, S' are two subsets of a monoid G, then we define SS' to be the subset consisting of all elements xy, with x ∈ S and y ∈ S'. Inductively, we can define the product of a finite number of subsets, and we have associativity. For instance, if S, S', S'' are subsets of G, then (SS')S'' = S(S'S''). Observe that GG = G (because G has a unit element). If x ∈ G, then we define xS to be {x}S, where {x} is the set consisting of the single element x. Thus xS consists of all elements xy, with y ∈ S.

By a submonoid of G, we shall mean a subset H of G containing the unit element e, and such that, if x, y ∈ H then xy ∈ H (we say that H is closed under the law of composition). It is then clear that H is itself a monoid, under the law of composition induced by that of G.

If x is an element of a monoid G, then the subset of powers x^n (n = 0, 1, ...) is a submonoid of G. The set of integers ≥ 0 under addition is a monoid.

Later we shall define rings. If R is a commutative ring, we shall deal with multiplicative subsets S, that is subsets containing the unit element, and such that if x, y ∈ S then xy ∈ S. Such subsets are monoids.

A routine example. Let N be the natural numbers, i.e. the integers ≥ 0. Then N is an additive monoid. In some applications, it is useful to deal with a multiplicative version. See the definition of polynomials in Chapter II, §3, where a higher-dimensional version is also used for polynomials in several variables.

An interesting example. We assume that the reader is familiar with the terminology of elementary topology. Let M be the set of homeomorphism classes of compact (connected) surfaces. We shall define an addition in M. Let S, S' be compact surfaces. Let D be a small disc in S, and D' a small disc in S'. Let C, C' be the circles which form the boundaries of D and D' respectively. Let D_0, D'_0 be the interiors of D and D' respectively, and glue S − D_0 to S' − D'_0 by identifying C with C'. It can be shown that the resulting surface is independent,


up to homeomorphism, of the various choices made in the preceding construction. If σ, σ' denote the homeomorphism classes of S and S' respectively, we define σ + σ' to be the class of the surface obtained by the preceding gluing process. It can be shown that this addition defines a monoid structure on M, whose unit element is the class of the ordinary 2-sphere. Furthermore, if τ denotes the class of the torus, and π denotes the class of the projective plane, then every element σ of M has a unique expression of the form

$$\sigma = n\tau + m\pi$$

where n is an integer ≥ 0 and m = 0, 1, or 2. We have 3π = τ + π.

(The reasons for inserting the preceding example are twofold : First to relieve the essential dullness of the section. Second to show the reader that monoids exist in nature. Needless to say, the example will not be used in any way throughout the rest of the book.)

Still other examples. At the end of Chapter III, §4, we shall remark that isomorphism classes of modules over a ring form a monoid under the direct sum. In Chapter XV, §1, we shall consider a monoid consisting of equivalence classes of quadratic forms.

§2. GROUPS

A group G is a monoid, such that for every element x ∈ G there exists an element y ∈ G such that xy = yx = e. Such an element y is called an inverse for x. Such an inverse is unique, because if y' is also an inverse for x, then

y' = y'e = y'(xy) = (y'x)y = ey = y.

We denote this inverse by x^{-1} (or by −x when the law of composition is written additively).

For any positive integer n, we let x^{-n} = (x^{-1})^n. Then the usual rules for exponentiation hold for all integers, not only for integers ≥ 0 (as we pointed out for monoids in §1). The trivial proofs are left to the reader.

In the definitions of unit elements and inverses, we could also define left units and left inverses (in the obvious way). One can easily prove that these are also units and inverses respectively under suitable conditions. Namely:

Let G be a set with an associative law of composition, let e be a left unit for that law, and assume that every element has a left inverse. Then e is a unit, and each left inverse is also an inverse. In particular, G is a group.

To prove this, let a ∈ G and let b ∈ G be such that ba = e. Then

bab = eb = b.

Multiplying on the left by a left inverse for b yields

ab = e,

or in other words, b is also a right inverse for a. One sees also that a is a left


with r ∈ Z and r prime to n. A generator for this group is called a primitive n-th root of unity.

Example. The direct product. Let G_1, G_2 be groups. Let G_1 × G_2 be the direct product as sets, so G_1 × G_2 is the set of all pairs (x_1, x_2) with x_i ∈ G_i. We define the product componentwise by

$$(x_1, x_2)(y_1, y_2) = (x_1 y_1, x_2 y_2).$$

Then G_1 × G_2 is a group, whose unit element is (e_1, e_2) (where e_i is the unit element of G_i). Similarly, for n groups we define G_1 × ⋯ × G_n to be the set of n-tuples with x_i ∈ G_i (i = 1, ..., n), and componentwise multiplication.

Even more generally, let I be a set, and for each i ∈ I, let G_i be a group. Let G = ∏ G_i be the set-theoretic product of the sets G_i. Then G is the set of all families (x_i)_{i∈I} with x_i ∈ G_i. We can define a group structure on G by componentwise multiplication, namely, if (x_i)_{i∈I} and (y_i)_{i∈I} are two elements of G, we define their product to be (x_i y_i)_{i∈I}. We define the inverse of (x_i)_{i∈I} to be (x_i^{-1})_{i∈I}. It is then obvious that G is a group called the direct product of the family.
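A sketch of the direct product construction for two finite cyclic groups (the helper names `cyclic` and `direct_product` are our own; each group is represented as a tuple of elements, operation, unit, and inverse):

```python
from itertools import product

def cyclic(n):
    """The additive group Z/nZ as (elements, operation, unit, inverse)."""
    elems = list(range(n))
    return elems, (lambda x, y: (x + y) % n), 0, (lambda x: (-x) % n)

def direct_product(*groups):
    """Componentwise product of groups, as in the text."""
    elems = [tuple(t) for t in product(*[g[0] for g in groups])]
    op = lambda xs, ys: tuple(g[1](x, y) for g, x, y in zip(groups, xs, ys))
    e = tuple(g[2] for g in groups)
    inv = lambda xs: tuple(g[3](x) for g, x in zip(groups, xs))
    return elems, op, e, inv

elems, op, e, inv = direct_product(cyclic(3), cyclic(4))
print(len(elems), e)                           # 12 (0, 0)
assert all(op(x, inv(x)) == e for x in elems)  # every element has an inverse
```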

Let G be a group. A subgroup H of G is a subset of G containing the unit element, and such that H is closed under the law of composition and inverse (i.e. it is a submonoid, such that if x ∈ H then x^{-1} ∈ H). A subgroup is called trivial if it consists of the unit element alone. The intersection of an arbitrary non-empty family of subgroups is a subgroup (trivial verification).

Let G be a group and S a subset of G. We shall say that S generates G, or that S is a set of generators for G, if every element of G can be expressed as a product of elements of S or inverses of elements of S, i.e. as a product x_1 ⋯ x_n where each x_i or x_i^{-1} is in S. It is clear that the set of all such products is a subgroup of G (the empty product is the unit element), and is the smallest subgroup of G containing S. Thus S generates G if and only if the smallest subgroup of G containing S is G itself. If G is generated by S, then we write G = ⟨S⟩. By definition, a cyclic group is a group which has one generator.

Given elements x_1, ..., x_n ∈ G, these elements generate a subgroup ⟨x_1, ..., x_n⟩, namely the set of all elements of G of the form

$$x_{i_1}^{k_1} \cdots x_{i_r}^{k_r} \quad\text{with } k_1, \ldots, k_r \in \mathbf{Z}.$$

A single element x ∈ G generates a cyclic subgroup.
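The subgroup generated by a set S can be computed for a finite group by closing S and its inverses under products, exactly as in the characterization above; a sketch (our own helper names), using permutations written as tuples:

```python
def compose(s, t):
    """(s ∘ t)(k) = s(t(k)); permutations of {0, ..., n-1} as tuples."""
    return tuple(s[t[k]] for k in range(len(t)))

def inverse(s):
    inv = [0] * len(s)
    for k, v in enumerate(s):
        inv[v] = k
    return tuple(inv)

def generated_subgroup(gens):
    """Smallest subgroup containing gens: close under products and inverses."""
    elems = set(gens) | {inverse(g) for g in gens}
    frontier = set(elems)
    while frontier:
        new = {compose(a, b) for a in frontier for b in elems} \
            | {compose(a, b) for a in elems for b in frontier}
        frontier = new - elems
        elems |= frontier
    return elems

# A transposition and a 3-cycle generate all of S3:
S3 = generated_subgroup([(1, 0, 2), (1, 2, 0)])
print(len(S3))  # 6
```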

Example. There are two non-abelian groups of order 8. One is the group of symmetries of the square, generated by two elements σ, τ such that

$$\sigma^4 = \tau^2 = e \quad\text{and}\quad \tau\sigma\tau^{-1} = \sigma^3.$$

The other is the quaternion group, generated by two elements i, j such that if we put k = ij and m = i^2, then

$$i^4 = e, \quad i^2 = j^2 = k^2, \quad ij = mji.$$

After you know enough facts about groups, you can easily do Exercise 35.


Let G, G' be monoids. A monoid-homomorphism (or simply homomorphism) of G into G' is a mapping f : G → G' such that f(xy) = f(x)f(y) for all x, y ∈ G, and mapping the unit element of G into that of G'. If G, G' are groups, a group-homomorphism of G into G' is simply a monoid-homomorphism.

We sometimes say: "Let f : G → G' be a group-homomorphism" to mean: "Let G, G' be groups, and let f be a homomorphism from G into G'."

Let f : G → G' be a group-homomorphism. Then

$$f(x^{-1}) = f(x)^{-1}$$

because if e, e' are the unit elements of G, G' respectively, then

$$e' = f(e) = f(xx^{-1}) = f(x)f(x^{-1}).$$

Furthermore, if G, G' are groups and f : G → G' is a map such that f(xy) = f(x)f(y) for all x, y in G, then f(e) = e', because f(ee) = f(e) and also = f(e)f(e). Multiplying by the inverse of f(e) shows that f(e) = e'.

Let G, G' be monoids. A homomorphism f : G → G' is called an isomorphism if there exists a homomorphism g : G' → G such that f ∘ g and g ∘ f are the identity mappings (in G' and G respectively). It is trivially verified that f is an isomorphism if and only if f is bijective. The existence of an isomorphism between two groups G and G' is sometimes denoted by G ≈ G'. If G = G', we say that an isomorphism is an automorphism. A homomorphism of G into itself is also called an endomorphism.

Example. Let G be a monoid and x an element of G. Let N denote the (additive) monoid of integers ≥ 0. Then the map f : N → G such that f(n) = x^n is a homomorphism. If G is a group, we can extend f to a homomorphism of Z into G (x^n is defined for all n ∈ Z, as pointed out previously). The trivial proofs are left to the reader.
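A numeric illustration of this example (ours, not the book's): with x invertible modulo m, the map f(n) = x^n mod m is a homomorphism from (Z, +) into the multiplicative group of units mod m. Python's three-argument `pow` (3.8+) handles negative exponents whenever gcd(x, m) = 1:

```python
# f : (Z, +) -> (Z/13)^*, f(n) = x^n mod 13, satisfies f(a + b) = f(a) f(b).
x, m = 2, 13
f = lambda n: pow(x, n, m)   # pow accepts negative n here since gcd(2, 13) = 1
checks = [(a, b) for a in range(-5, 6) for b in range(-5, 6)]
print(all(f(a + b) == f(a) * f(b) % m for a, b in checks))  # True
```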

Let n be a fixed integer and let G be a commutative group. Then one verifies easily that the map

$$x \mapsto x^n$$

from G into itself is a homomorphism. So is the map x ↦ x^{-1}. The map x ↦ x^n is called the n-th power map.

Example. Let I = {i} be an indexing set, and let {G_i} be a family of groups. Let G = ∏ G_i be their direct product. Let

$$p_i : G \to G_i$$

be the projection on the i-th factor. Then p_i is a homomorphism.

Let G be a group, S a set of generators for G, and G' another group. Let f : S → G' be a map. If there exists a homomorphism f̄ of G into G' whose restriction to S is f, then there is only one.


If (x, y) is in its kernel, then x = y^{-1}, whence x lies in both H and K, and x = e, so that y = e also, and our map is an isomorphism.

We observe that Proposition 2.1 generalizes by induction to a finite number of subgroups H_1, ..., H_n whose elements commute with each other, such that

$$H_1 \cdots H_n = G,$$

and such that

$$H_{i+1} \cap (H_1 \cdots H_i) = e.$$

In that case, G is isomorphic to the direct product

$$H_1 \times \cdots \times H_n.$$
Let G be a group and H a subgroup. A left coset of H in G is a subset of G of type aH, for some element a of G. An element of aH is called a coset representative of aH. The map x ↦ ax induces a bijection of H onto aH. Hence any two left cosets have the same cardinality.

Observe that if a, b are elements of G and aH, bH are cosets having one element in common, then they are equal. Indeed, let ax = by with x, y ∈ H. Then a = byx^{-1}. But yx^{-1} ∈ H. Hence aH = b(yx^{-1})H = bH, because for any z ∈ H we have zH = H.

We conclude that G is the disjoint union of the left cosets of H. A similar remark applies to right cosets (i.e. subsets of G of type Ha). The number of left cosets of H in G is denoted by (G : H), and is called the (left) index of H in G. The index of the trivial subgroup is called the order of G and is written (G : 1).
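The coset decomposition can be verified directly for a small example (ours): the left cosets of H = {e, (0 1)} in S3 partition the group into cells of equal size, and the count of cells times the order of H recovers the order of G:

```python
from itertools import permutations

def compose(s, t):
    """(s ∘ t)(k) = s(t(k)); permutations as tuples."""
    return tuple(s[t[k]] for k in range(len(t)))

G = list(permutations(range(3)))          # S3, order 6
H = [(0, 1, 2), (1, 0, 2)]                # the subgroup {e, (0 1)}

# Each left coset aH, recorded as a sorted tuple so equal cosets coincide:
cosets = {tuple(sorted(compose(a, h) for h in H)) for a in G}
print(len(cosets))                        # 3 distinct cosets, i.e. (G : H) = 3
assert all(len(c) == len(H) for c in cosets)   # all cosets have the same size
assert len(cosets) * len(H) == len(G)          # (G : H)(H : 1) = (G : 1)
```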

From the above conclusion, we get:

Proposition 2.2. Let G be a group and H a subgroup. Then

$$(G : H)(H : 1) = (G : 1),$$

in the sense that if two of these indices are finite, so is the third and equality holds as stated. If (G : 1) is finite, the order of H divides the order of G.
More generally, let H, K be subgroups of G and let H ⊃ K. Let {x_i} be a set of (left) coset representatives of K in H and let {y_j} be a set of coset representatives of H in G. Then we contend that {y_j x_i} is a set of coset representatives of K in G.

Proof. Note that

$$G = \bigcup_j y_j H \quad\text{(disjoint)}, \qquad H = \bigcup_i x_i K \quad\text{(disjoint)}.$$

Hence

$$G = \bigcup_{i,j} y_j x_i K.$$

We must show that this union is disjoint, i.e. that the y_j x_i represent distinct cosets. Suppose

$$y_j x_i K = y_{j'} x_{i'} K$$

for a pair of indices (j, i) and (j', i'). Multiplying by H on the right, and noting that x_i, x_{i'} are in H, we get

$$y_j H = y_{j'} H,$$

whence y_j = y_{j'}. From this it follows that x_i K = x_{i'} K and therefore that x_i = x_{i'}, as was to be shown.

The formula of Proposition 2.2 may therefore be generalized by writing

$$(G : K) = (G : H)(H : K),$$

with the understanding that if two of the three indices appearing in this formula are finite, then so is the third and the formula holds.

The above results are concerned systematically with left cosets. For the right cosets, see Exercise 10.

Example. A group of prime order is cyclic. Indeed, let G have order p and let a ∈ G, a ≠ e. Let H be the subgroup generated by a. Then #(H) divides p and is ≠ 1, so #(H) = p and so H = G, which is therefore cyclic.

Example. Let J_n = {1, ..., n}. Let S_n be the group of permutations of J_n. We define a transposition to be a permutation τ such that there exist two elements r ≠ s in J_n for which τ(r) = s, τ(s) = r, and τ(k) = k for all k ≠ r, s. Note that the transpositions generate S_n. Indeed, say σ is a permutation, σ(n) = k ≠ n. Let τ be the transposition interchanging k, n. Then τσ leaves n fixed, and by induction we can write τσ as a product of transpositions in Perm(J_{n-1}), thus proving that transpositions generate S_n.

Next we note that #(S_n) = n!. Indeed, let H be the subgroup of S_n consisting of those elements which leave n fixed. Then H may be identified with S_{n-1}. If σ_i (i = 1, ..., n) is an element of S_n such that σ_i(n) = i, then it is immediately verified that σ_1, ..., σ_n are coset representatives of H. Hence by induction (S_n : 1) = n(H : 1) = n!. Observe that for σ_i we could have taken the transposition τ_i, which interchanges i and n (except for i = n, where we could take σ_n to be the identity).
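Both counting facts in the last example can be checked by machine for small n; a sketch (the helper `generated` is ours) that closes the set of transpositions of J_4 under composition and compares the result with n!:

```python
from itertools import permutations
from math import factorial

def compose(s, t):
    """(s ∘ t)(k) = s(t(k)); permutations as tuples."""
    return tuple(s[t[k]] for k in range(len(t)))

def generated(gens, e):
    """Close a generating set under left multiplication; finite, so this is the subgroup."""
    elems, frontier = {e}, {e}
    while frontier:
        frontier = {compose(g, a) for g in gens for a in frontier} - elems
        elems |= frontier
    return elems

n = 4
e = tuple(range(n))
transpositions = []
for r in range(n):
    for s in range(r + 1, n):
        t = list(e)
        t[r], t[s] = t[s], t[r]
        transpositions.append(tuple(t))

Sn = generated(transpositions, e)
print(len(Sn) == factorial(n) == len(set(permutations(range(n)))))  # True
```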

§3. NORMAL SUBGROUPS

We have already observed that the kernel of a group-homomorphism is a subgroup. We now wish to characterize such subgroups.

Let f : G → G' be a group-homomorphism, and let H be its kernel. If x is an element of G, then xH = Hx, because both are equal to f^{-1}(f(x)). We can also rewrite this relation as xHx^{-1} = H.


Second, let G be the set of all maps T_{a,b} : R → R such that

$$T_{a,b}(x) = ax + b, \quad\text{with } a \neq 0 \text{ and } b \text{ arbitrary}.$$

Then G is a group under composition of mappings. Let A be the multiplicative group of maps of the form T_{a,0} (isomorphic to R*, the non-zero elements of R), and let N be the group of translations T_{1,b} with b ∈ R. Then the reader will verify at once that T_{a,b} ↦ a is a homomorphism of G onto the multiplicative group, whose kernel is the group of translations, which is therefore normal. Furthermore, we have G = AN = NA, and N ∩ A = {id}. In the terminology of Exercise 12, G is the semidirect product of A and N.
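A sketch of this example, representing T_{a,b} simply as the pair (a, b) (our convention, not the book's): composition multiplies the a-components, so T_{a,b} ↦ a is a homomorphism, and conjugating a translation by any element of G again gives a translation, i.e. stays in the kernel:

```python
# Affine maps T_{a,b}(x) = a x + b with a != 0, stored as the pair (a, b).
def compose(T1, T2):
    """(T_{a,b} ∘ T_{c,d})(x) = a(c x + d) + b = T_{ac, ad+b}(x)."""
    (a, b), (c, d) = T1, T2
    return (a * c, a * d + b)

def inverse(T):
    a, b = T
    return (1 / a, -b / a)

# The map T_{a,b} -> a is a homomorphism onto the multiplicative group:
T1, T2 = (2.0, 3.0), (5.0, -1.0)
assert compose(T1, T2)[0] == T1[0] * T2[0]

# Its kernel is the translations T_{1,b}; conjugation preserves them:
N = (1.0, 7.0)                      # a translation
conj = compose(compose(T1, N), inverse(T1))
print(conj[0])                      # 1.0 — still inside the kernel
```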

Let H be a subgroup of G. Then H is obviously a normal subgroup of its normalizer N_H. We leave the following statements as exercises:

If K is any subgroup of G containing H and such that H is normal in K, then K ⊂ N_H.

If K is a subgroup of N_H, then KH is a group and H is normal in KH.

The normalizer of H is the largest subgroup of G in which H is normal.
Let G be a group and H a normal subgroup. Let x, y ∈ G. We shall write

$$x \equiv y \pmod H$$

if x and y lie in the same coset of H, or equivalently if xy^{-1} (or y^{-1}x) lies in H. We read this relation "x and y are congruent modulo H."

When G is an additive group, then x ≡ 0 (mod H) means that x lies in H, and x ≡ y (mod H) means that x − y (or y − x) lies in H. This notation of congruence is used mostly for additive groups. Let

$$G' \xrightarrow{f} G \xrightarrow{g} G''$$

be a sequence of homomorphisms. We shall say that this sequence is exact if Im f = Ker g. For example, if H is a normal subgroup of G then the sequence

$$H \xrightarrow{j} G \xrightarrow{\varphi} G/H$$

is exact (where j = inclusion and φ = canonical map). A sequence of homomorphisms having more than one term, like

$$G_1 \xrightarrow{f_1} G_2 \xrightarrow{f_2} G_3 \to \cdots \xrightarrow{f_{n-1}} G_n,$$

is called exact if it is exact at each joint, i.e. if

$$\operatorname{Im} f_i = \operatorname{Ker} f_{i+1}$$

for each i = 1, ..., n − 2. For example, to say that

$$0 \to G' \xrightarrow{f} G \xrightarrow{g} G'' \to 0$$

is exact means that f is injective, that Im f = Ker g, and that g is surjective. If H = Ker g then this sequence is essentially the same as the exact sequence

$$0 \to H \to G \to G/H \to 0.$$

More precisely, there exists a commutative diagram

$$\begin{array}{ccccccccc}
0 & \to & G' & \xrightarrow{f} & G & \xrightarrow{g} & G'' & \to & 0 \\
  &     & \downarrow & & \downarrow & & \downarrow & & \\
0 & \to & H & \to & G & \to & G/H & \to & 0
\end{array}$$

in which the vertical maps are isomorphisms, and the rows are exact.

Next we describe some homomorphisms, all of which are called canonical.

(i) Let G, G' be groups and f : G → G' a homomorphism whose kernel is H. Let φ : G → G/H be the canonical map. Then there exists a unique homomorphism f_* : G/H → G' such that f = f_* ∘ φ, and f_* is injective.

To define f_*, let xH be a coset of H. Since f(xy) = f(x) for all y ∈ H, we define f_*(xH) to be f(x). This value is independent of the choice of coset representative x, and it is then trivially verified that f_* is a homomorphism, is injective, and is the unique homomorphism satisfying our requirements. We shall say that f_* is induced by f.

Our homomorphism f_* induces an isomorphism

$$\lambda : G/H \to \operatorname{Im} f$$

of G/H onto the image of f, and thus f can be factored into the following succession of homomorphisms:

$$G \xrightarrow{\varphi} G/H \xrightarrow{\lambda} \operatorname{Im} f \xrightarrow{j} G'.$$

Here, j is the inclusion of Im f in G'.
(ii) Let G be a group and H a subgroup. Let N be the intersection of all normal subgroups containing H. Then N is normal, and hence is the smallest normal subgroup of G containing H. Let f : G → G' be a homomorphism whose kernel contains H. Then the kernel of f contains N, and there exists a unique homomorphism f_* : G/N → G', said to be induced by f, making the following diagram commutative:

$$\begin{array}{ccc}
G & \xrightarrow{f} & G' \\
{\scriptstyle \varphi}\downarrow & \nearrow_{f_*} & \\
G/N & &
\end{array}$$

As before, φ is the canonical map. We can define f_* as in (i) by the rule

$$f_*(xN) = f(x).$$

This is well defined, and is trivially verified to satisfy all our requirements.


again called canonical, giving rise to the commutative diagram

$$\begin{array}{ccccccccc}
0 & \to & H & \to & G & \to & G/H & \to & 0 \\
  &     & \downarrow & & {\scriptstyle f}\downarrow & & \downarrow{\scriptstyle \bar f} & & \\
0 & \to & H' & \to & G' & \to & G'/H' & \to & 0.
\end{array}$$

If f is surjective, then f̄ is an isomorphism.

We shall now describe some applications of our homomorphism statements.

Let G be a group. A sequence of subgroups

$$G = G_0 \supset G_1 \supset G_2 \supset \cdots \supset G_m$$

is called a tower of subgroups. The tower is said to be normal if each G_{i+1} is normal in G_i (i = 0, ..., m − 1). It is said to be abelian (resp. cyclic) if it is normal and if each factor group G_i/G_{i+1} is abelian (resp. cyclic).

Let f : G → G' be a homomorphism and let

$$G' = G'_0 \supset G'_1 \supset \cdots \supset G'_m$$

be a normal tower in G'. Let G_i = f^{-1}(G'_i). Then the G_i (i = 0, ..., m) form a normal tower. If the G'_i form an abelian tower (resp. cyclic tower) then the G_i form an abelian tower (resp. cyclic tower), because we have an injective homomorphism

$$G_i/G_{i+1} \to G'_i/G'_{i+1}$$

for each i, and because a subgroup of an abelian group (resp. a cyclic group) is abelian (resp. cyclic).

A refinement of a tower G = G_0 ⊃ G_1 ⊃ ⋯ ⊃ G_m is a tower which can be obtained by inserting a finite number of subgroups in the given tower. A group is said to be solvable if it has an abelian tower, whose last element is the trivial subgroup (i.e. G_m = {e} in the above notation).
Proposition 3.1. Let G be a finite group. An abelian tower of G admits a cyclic refinement. Let G be a finite solvable group. Then G admits a cyclic tower, whose last element is {e}.

Proof. The second assertion is an immediate consequence of the first, and it clearly suffices to prove that if G is finite, abelian, then G admits a cyclic tower. We use induction on the order of G. Let x be an element of G. We may assume that x ≠ e. Let X be the cyclic group generated by x. Let G' = G/X. By induction, we can find a cyclic tower in G', and its inverse image is a cyclic tower in G whose last element is X. If we refine this tower by inserting {e} at the end, we obtain the desired cyclic tower.

Example. In Theorem 6.4 it will be proved that a group whose order is a prime power is solvable.


Example. One of the major results of group theory is the Feit-Thompson theorem that all finite groups of odd order are solvable. Cf. [Go 68].

Example. Solvable groups will occur in field theory as the Galois groups of solvable extensions. See Chapter VI, Theorem 7.2.
Example. We assume the reader knows the basic notions of linear algebra. Let k be a field. Let G = GL(n, k) be the group of invertible n × n matrices in k. Let T = T(n, k) be the upper triangular group; that is, the subgroup of matrices which are 0 below the diagonal. Let D be the diagonal group of diagonal matrices with non-zero components on the diagonal. Let N be the additive group of matrices which are 0 on and below the diagonal, and let U = I + N, where I is the unit n × n matrix. Then U is a subgroup of G. (Note that N consists of nilpotent matrices, i.e. matrices A such that A^m = 0 for some positive integer m. Then (I − A)^{-1} = I + A + A^2 + ⋯ + A^{m−1} is computed using the geometric series.)

Given a matrix A ∈ T, let diag(A) be the diagonal matrix which has the same diagonal components as A. Then the reader will verify that we get a surjective homomorphism T → D given by A ↦ diag(A). The kernel of this homomorphism is precisely U.

More generally, observe that for r ≥ 2, the set N^{r−1} consists of all matrices of the form

$$M = \begin{pmatrix}
0 & \cdots & 0 & a_{1r} & \cdots & a_{1n} \\
0 & \cdots & 0 & 0 & a_{2,r+1} & \cdots \\
\vdots & & & & \ddots & \vdots \\
0 & \cdots & & & & a_{n-r+1,n} \\
\vdots & & & & & \vdots \\
0 & \cdots & & & & 0
\end{pmatrix}.$$

Let U_r = I + N^r. Then U_1 = U and U_r ⊃ U_{r+1}. Furthermore, U_{r+1} is normal in U_r, and the factor group is isomorphic to the additive group (!) k^{n−r}, under the mapping which sends I + M to the (n − r)-tuple (a_{1,r+1}, ..., a_{n−r,n}) ∈ k^{n−r}. This (n − r)-tuple could be called the r-th upper diagonal. Thus we obtain an abelian tower.
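The abelian quotients U_r/U_{r+1} reflect the fact that commutators in U fall one level deeper in the tower; a small check for n = 3 (the helpers are ours; the inverse formula uses (I + A)^{-1} = I − A + A^2, valid here since A^3 = 0):

```python
def mat_mul(A, B):
    """Square integer matrix product."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def unipotent_inverse(U):
    """(I + A)^{-1} = I - A + A^2 for 3x3 unipotent U, since A^3 = 0."""
    n = len(U)
    I = [[int(i == j) for j in range(n)] for i in range(n)]
    A = [[U[i][j] - I[i][j] for j in range(n)] for i in range(n)]
    A2 = mat_mul(A, A)
    return [[I[i][j] - A[i][j] + A2[i][j] for j in range(n)] for i in range(n)]

X = [[1, 2, 5], [0, 1, 3], [0, 0, 1]]
Y = [[1, 4, 1], [0, 1, 7], [0, 0, 1]]

# The commutator X Y X^{-1} Y^{-1} has zero first superdiagonal, i.e. lies in U2:
C = mat_mul(mat_mul(X, Y), mat_mul(unipotent_inverse(X), unipotent_inverse(Y)))
print(C[0][1], C[1][2])  # 0 0
```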

Theorem 3.2. Let G be a group and H a normal subgroup. Then G is solvable if and only if H and G/H are solvable.

Proof. We prove that G solvable implies that H is solvable. Let

$$G = G_0 \supset G_1 \supset \cdots \supset G_r = \{e\}$$

be a tower of groups with G_{i+1} normal in G_i and such that G_i/G_{i+1} is abelian. Let H_i = H ∩ G_i. Then H_{i+1} is normal in H_i, and we have an embedding

$$H_i/H_{i+1} \to G_i/G_{i+1},$$

whence H_i/H_{i+1} is abelian, proving that H is solvable. We leave the proofs of the other statements to the reader.
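Solvability of a concrete finite group can be tested through its derived series, i.e. iterated commutator subgroups, which produce an abelian tower when the series reaches {e}; a sketch for S3 (helper names ours):

```python
from itertools import permutations

def compose(s, t):
    """(s ∘ t)(k) = s(t(k)); permutations as tuples."""
    return tuple(s[t[k]] for k in range(len(t)))

def inverse(s):
    inv = [0] * len(s)
    for k, v in enumerate(s):
        inv[v] = k
    return tuple(inv)

def commutator_subgroup(G):
    """Subgroup generated by all commutators x y x^{-1} y^{-1} of G."""
    comms = {compose(compose(x, y), compose(inverse(x), inverse(y)))
             for x in G for y in G}
    elems, frontier = set(comms), set(comms)
    while frontier:
        frontier = {compose(a, b) for a in elems for b in frontier} - elems
        elems |= frontier
    return elems

S3 = set(permutations(range(3)))
D1 = commutator_subgroup(S3)   # A3: the 3-cycles together with e
D2 = commutator_subgroup(D1)   # trivial, since A3 is abelian
print(len(S3), len(D1), len(D2))  # 6 3 1
```

Since the series 6 → 3 → 1 terminates at the trivial group, S3 is solvable.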

[Diagram: the "butterfly" lattice of subgroups, with U, u on the left, V, v on the right, U ∩ V at the top of the central column, u ∩ v at the bottom, and the products u(U ∩ V) and (U ∩ V)v at the tops of the two wings.]

In this diagram, we are given U, u, V, v. All the other points in the diagram correspond to certain groups which can be determined as follows. The intersection of two line segments going downwards represents the intersection of groups. Two lines going upwards meet in a point which represents the product of two subgroups (i.e. the smallest subgroup containing both of them). We consider the two parallelograms representing the wings of the butterfly, and we shall give isomorphisms of the factor groups as follows:

$$\frac{u(U \cap V)}{u(U \cap v)} \;\cong\; \frac{U \cap V}{(u \cap V)(U \cap v)} \;\cong\; \frac{(U \cap V)v}{(u \cap V)v}.$$

In fact, the vertical side common to both parallelograms has U ∩ V as its top end point, and (u ∩ V)(U ∩ v) as its bottom end point. We have an isomorphism

$$(U \cap V)/(u \cap V)(U \cap v) \;\cong\; u(U \cap V)/u(U \cap v).$$

This is obtained from the isomorphism theorem

$$H/(H \cap N) \cong HN/N$$

by setting H = U ∩ V and N = u(U ∩ v). This gives us the isomorphism on the left. By symmetry we obtain the corresponding isomorphism on the right, which proves the Butterfly lemma.

Let G be a group, and let

$$G = G_1 \supset G_2 \supset \cdots \supset G_r = \{e\},$$
$$G = H_1 \supset H_2 \supset \cdots \supset H_s = \{e\}$$

be normal towers of subgroups, ending with the trivial group. We shall say that these towers are equivalent if r = s and if there exists a permutation of the indices i = 1, ..., r − 1, written i ↦ i′, such that

$$G_i/G_{i+1} \approx H_{i'}/H_{i'+1}.$$

In other words, the sequences of factor groups in our two towers are the same, up to isomorphisms, and a permutation of the indices.

Theorem 3.4. (Schreier) Let G be a group. Two normal towers of subgroups ending with the trivial group have equivalent refinements.

Proof. Let the two towers be as above. For each i = 1, ..., r − 1 and j = 1, ..., s we define

$$G_{ij} = G_{i+1}(H_j \cap G_i).$$

Then G_{is} = G_{i+1}, and we have a refinement of the first tower:

$$G = G_{11} \supset G_{12} \supset \cdots \supset G_{1,s-1} \supset G_2 = G_{21} \supset G_{22} \supset \cdots \supset G_{r-1,1} \supset \cdots \supset G_{r-1,s-1} \supset \{e\}.$$

Similarly, we define

$$H_{ji} = H_{j+1}(G_i \cap H_j),$$

for j = 1, ..., s − 1 and i = 1, ..., r. This yields a refinement of the second tower. By the butterfly lemma, for i = 1, ..., r − 1 and j = 1, ..., s − 1 we have isomorphisms

$$G_{ij}/G_{i,j+1} \;\approx\; H_{ji}/H_{j,i+1}.$$

We view each one of our refined towers as having (r − 1)(s − 1) + 1 elements, namely G_{ij} (i = 1, ..., r − 1; j = 1, ..., s − 1) and {e} in the first case, H_{ji} and {e} in the second case. The preceding isomorphism for each pair of indices (i, j) shows that our refined towers are equivalent, as was to be proved.
A group G is said to be simple if it is non-trivial, and has no normal subgroups other than {e} and G itself.

Theorem 3.5. (Jordan-Hölder) Let G be a group, and let

$$G = G_1 \supset G_2 \supset \cdots \supset G_r = \{e\}$$

be a normal tower such that each group G_i/G_{i+1} is simple, and G_i ≠ G_{i+1} for i = 1, ..., r − 1. Then any other normal tower of G having the same properties is equivalent to this one.

Proof. Given any refinement {G_{ij}} as before for our tower, we observe that for each i, there exists precisely one index j such that G_i/G_{i+1} = G_{ij}/G_{i,j+1}. Thus the sequence of non-trivial factors for the original tower, or the refined tower, is the same. This proves our theorem.
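The Jordan-Hölder theorem can be illustrated on Z/12Z: two different composition series produce the same multiset of simple factor orders (a sketch of ours working only with subgroup orders, which determine the cyclic factors here):

```python
# Two composition series of Z/12Z (additive), recorded by the orders of the
# subgroups <d>; the factor group at each step has order n_i / n_{i+1}.
def factor_orders(series):
    """series: subgroup orders G = n0 > n1 > ... > 1."""
    return sorted(series[i] // series[i + 1] for i in range(len(series) - 1))

series_a = [12, 6, 3, 1]   # Z/12 ⊃ <2> ⊃ <4> ⊃ {0}
series_b = [12, 4, 2, 1]   # Z/12 ⊃ <3> ⊃ <6> ⊃ {0}
print(factor_orders(series_a), factor_orders(series_b))  # [2, 2, 3] [2, 2, 3]
assert factor_orders(series_a) == factor_orders(series_b)
```

The factors of orders 2, 2, 3 are the simple groups Z/2, Z/2, Z/3, the same up to isomorphism and a permutation of indices, as the theorem requires.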