\chapter{Three important functors}
There are three functors that will be integral to our study of commutative
algebra in the future: localization, the tensor product, and $\hom$.
While localization is an \emph{exact} functor, the tensor product and $\hom$
are not. The failure of exactness in those cases leads to the theory of
flatness and projectivity (and injectivity), and eventually the \emph{derived functors}
$\mathrm{Tor}$ and $\mathrm{Ext}$ that crop up in commutative algebra.
\section{Localization}
Localization is the process of making invertible a collection of elements in a
ring. It is a generalization of the process of forming a quotient field of an
integral domain.
\subsection{Geometric intuition}
We first start off with some of the geometric intuition behind the idea of
localization. Suppose we have a Riemann surface $X$ (for example, the Riemann
sphere). Let $A(U)$ be the ring of holomorphic functions over some neighborhood
$U\subset X$. Now, for a function to be holomorphic on $U$, all that is required is
that it not have a pole inside $U$; thus the condition is strictest when $U=X$,
and as $U$ gets smaller, functions begin to show
up that may not arise from the restriction of a holomorphic function over
a larger domain. For example, if we want to study holomorphicity ``near a
point $z_0$,'' all that we should require is that the function not have a pole at
$z_0$. This means that we should consider quotients of holomorphic functions
$f/g$ where $g(z_0)\neq 0$. This process of inverting a collection of elements
is expressed through the algebraic construction known as ``localization.''
\subsection{Localization at a multiplicative subset}
Let $R$ be a commutative ring.
We start by constructing the notion of \emph{localization} in the most general
sense.
We have already implicitly used this definition, but nonetheless, we make it
formally:
\begin{definition} \label{multset}
A subset $S \subset R$ is a \textbf{multiplicative subset} if $1 \in S$ and
if $x,y \in S$ implies $xy \in S$.
\end{definition}
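\begin{example}
For instance, for any $f \in R$, the set of powers $\left\{1, f, f^2, \dots\right\}$ is
a multiplicative subset, as is the complement of a prime ideal (as we shall see below).
\end{example}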
We now define the notion of \emph{localization}. Formally, this means
inverting things.
This will give us a functor from $R$-modules to $R$-modules.
\begin{definition}
If $M$ is an $R$-module, we define the module $S^{-1}M$ as the set of formal
fractions
\[ \left\{m/s : m \in M, s \in S\right\} \]
modulo an equivalence relation: $m/s \sim m'/s'$ if and only if
\[ t( s'm - m's ) = 0 \]
for some $t \in S$. The reason we need to include the $t$ in the definition
is that otherwise the
relation would not be transitive (i.e. would not be an
equivalence relation).
\end{definition}
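To see why the $t$ is needed, consider the following example, which shows that the
naive relation ``$m/s \sim m'/s'$ whenever $s'm - m's = 0$'' can fail to be transitive.
\begin{example}
Take $R = M = \mathbb{Z}/6\mathbb{Z}$ and $S = \left\{1, 3\right\}$ (a multiplicative
subset, as $3 \cdot 3 = 3$ in $R$). Under the naive relation, $2/1 \sim 0/3$ (since
$3 \cdot 2 - 0 \cdot 1 = 0$) and $0/3 \sim 4/1$ (since $1 \cdot 0 - 4 \cdot 3 = 0$), but
$2/1 \not\sim 4/1$, since $1 \cdot 2 - 4 \cdot 1 = -2 \neq 0$ in $R$. Once the $t$ is
allowed, all three fractions are identified: $3(1 \cdot 2 - 4 \cdot 1) = -6 = 0$.
\end{example}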
So two fractions agree precisely when they agree after clearing denominators and
multiplying by a suitable element of $S$.
It is easy to check that this is indeed an equivalence relation. Moreover
$S^{-1}M$ is an abelian group with the usual addition of fractions
\[ \frac{m}{s}+\frac{m'}{s'} = \frac{s'm + sm'}{ss'} \]
and it is easy to check that this is a legitimate abelian group.
\begin{definition}
Let $M$ be an $R$-module and $S \subset R$ a multiplicative subset.
The abelian group $S^{-1}M$ is naturally an $R$-module. We define
\[ x(m/s) = (xm)/s, \quad x \in R. \]
It is easy to check that this is well-defined and makes it into a module.
Finally, we note that localization is a \emph{functor} from the category of
$R$-modules to itself. Indeed, given $f: M \to N$, there is a naturally
induced map $S^{-1}M \stackrel{S^{-1}f}{\to} S^{-1}N$.
\end{definition}
We now consider the special case when the localized module is the initial ring
itself.
Let $M = R$. Then $S^{-1}R$ is an $R$-module, and it is in fact a commutative
ring in its own right. The ring structure is quite tautological:
\[ (x/s)(y/s') = (xy/ss'). \]
There is a map $R \to S^{-1}R$ sending $x \to x/1$, which is a
ring-homomorphism.
\begin{definition}
For $S \subset R$ a multiplicative set, the localization $S^{-1}R$ is a
commutative ring as above. In fact, it is an $R$-algebra; there is a natural
map $\phi: R \to S^{-1}R$ sending $r \to r/1$.
\end{definition}
We can, in fact, describe $\phi: R \to S^{-1}R$ by a \emph{universal
property}. Note
that for each $s \in S$, $\phi(s)$ is invertible. This is because $\phi(s) =
s/1$ which has a multiplicative inverse $1/s$. This property characterizes
$S^{-1}R$.
For any commutative ring $B$, $\hom(S^{-1}R, B)$ is naturally isomorphic to the
subset of $\hom(R,B)$ consisting of homomorphisms that send $S$ to units. The map takes
a homomorphism $S^{-1}R \to B$ to the composite $R \to S^{-1}R \to B$. The proof of this is very simple.
Suppose that $f: R \to B$ is such that $f(s) \in B$ is invertible for each $s
\in S$. Then we must define $S^{-1}R \to B$ by sending $r/s$ to
$f(r)f(s)^{-1}$. It is easy to check that this is well-defined and gives the
claimed natural isomorphism.
Let $R$ be a ring, $M$ an $R$-module, $S \subset R$ a multiplicatively closed
subset. We defined a ring of fractions $S^{-1}R$ and an $R$-module $S^{-1}M$.
But in fact this is a module over the ring $S^{-1}R$.
We just multiply $(x/t)(m/s) = (xm/st)$.
In particular, localization at $S$ gives a \emph{functor} from $R$-modules to
$S^{-1}R$-modules.
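\begin{example}
For instance, take $R = \mathbb{Z}$ and $S = \left\{1, 2, 4, 8, \dots\right\}$, the set
of powers of $2$. Then $S^{-1}\mathbb{Z}$ is the subring $\mathbb{Z}[1/2] \subset
\mathbb{Q}$ of fractions whose denominator is a power of $2$, and for any abelian group
$M$, the localization $S^{-1}M$ is obtained from $M$ by making multiplication by $2$
invertible.
\end{example}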
\begin{exercise}
Let $R$ be a ring, $S$ a multiplicative subset. Let $T$ be the $R$-algebra
$R[\left\{x_s\right\}_{s \in S}]/( \left\{sx_s - 1\right\})$. This is the
polynomial ring in the variables $x_s$, one for each $s \in S$, modulo the
ideal generated by the elements $sx_s - 1$. Prove that this $R$-algebra is naturally
isomorphic to $S^{-1}R$, using the universal property.
\end{exercise}
\begin{exercise} Define a functor $\mathbf{Rings} \to \mathbf{Sets}$ sending
a ring to
its set of units, and show that it is corepresentable (use $\mathbb{Z}[X,
X^{-1}]$).
\end{exercise}
\subsection{Local rings}
A special case of great importance in the future is when the multiplicative
subset is the complement of a prime ideal, and we study this in the present
subsection. Such localizations will be ``local rings'' and geometrically
correspond to the process of zooming at a point.
\begin{example}
Let $R$ be an integral domain and let $S = R - \left\{0\right\}$. This is a
multiplicative subset because $R$ is a domain. In this case, $S^{-1}R$ is just
the ring of fractions by allowing arbitrary nonzero denominators; it is a
field, and is called the \textbf{quotient field}. The most familiar example is
the construction of $\mathbb{Q}$ as the quotient field of $\mathbb{Z}$.
\end{example}
We'd like to generalize this example.
\begin{example}
Let $R$ be arbitrary and let $\mathfrak{p} \subset R$ be a prime ideal. This means that $1
\notin \mathfrak{p}$ and $x,y \in R - \mathfrak{p}$ implies that $xy \in R -
\mathfrak{p}$. Hence, the complement $S = R- \mathfrak{p}$ is multiplicatively
closed. We get a ring $S^{-1}R$.
\begin{definition}
This ring is denoted $R_{\mathfrak{p}}$ and is called the \textbf{localization
at $\mathfrak{p}$.} If $M$ is an $R$-module, we write $M_{\mathfrak{p}}$ for
the localization of $M$ at $R - \mathfrak{p}$.
\end{definition}
This generalizes the previous example (where $\mathfrak{p} = (0)$).
\end{example}
There is a nice property of the rings $R_{\mathfrak{p}}$. To elucidate this,
we start with a lemma.
\begin{lemma}
Let $R$ be a nonzero commutative ring. The following are equivalent:
\begin{enumerate}
\item $R$ has a unique maximal ideal.
\item If $x \in R$, then either $x$ or $1-x$ is invertible.
\end{enumerate}
\end{lemma}
\begin{definition}
In this case, we call $R$ \textbf{local}. A local ring is one with a unique
maximal ideal.
\end{definition}
\begin{proof}[Proof of the lemma]
First we prove $(2) \implies (1)$.
Assume $R$ is such that for
each $x$, either $x$ or $1-x$ is invertible. We will find the maximal ideal.
Let $\mathfrak{M} $ be the collection of noninvertible elements of $R$. This is
a subset of $R$, not containing $1$, and it is closed under multiplication.
Any proper ideal must be a subset of $\mathfrak{M}$, because otherwise that
proper ideal would contain an invertible element.
We just need to check that $\mathfrak{M}$ is closed under addition.
Suppose to the
contrary that $x, y \in \mathfrak{M}$ but $x+y$ is invertible. We get (with
$a = x/(x+y)$)
\[ 1 = \frac{x}{x+y} + \frac{y}{x+y} =a+(1-a). \]
Then one of $a,1-a$ is invertible. So either $x(x+y)^{-1}$ or $y(x+y)^{-1}$ is
invertible, which implies that either $x$ or $y$ is invertible, a contradiction.
Now prove the reverse direction. Assume $R$ has a unique maximal ideal
$\mathfrak{M}$. We claim that $\mathfrak{M}$ consists precisely of the
noninvertible elements. To see this, first note that $\mathfrak{M}$
can't contain any invertible elements since it is proper. Conversely, suppose
$x$ is not invertible, i.e. $(x) \subsetneq R$. Then $(x)$ is contained in a
maximal ideal by \rref{anycontainedinmaximal}, so $(x) \subset
\mathfrak{M}$ since $\mathfrak{M}$ is unique among maximal ideals.
Thus $x \in \mathfrak{M}$.
Suppose $x \in R$; we can write $1 = x + (1-x)$. Since $1 \notin \mathfrak{M}$,
one of $x, 1-x$ must not be in $\mathfrak{M}$, so one of them must be
invertible. So $(1) \implies (2)$. The lemma is proved.
\end{proof}
Let us give some examples of local rings.
\begin{example}
Any field is a local ring because the unique maximal ideal is $(0)$.
\end{example}
\begin{example}
Let $R$ be any commutative ring and $\mathfrak{p}\subset R$ a prime ideal. Then
$R_{\mathfrak{p}}$ is a local ring.
We state this as a result.
\begin{proposition}
$R_{\mathfrak{p}}$ is a local ring if $\mathfrak{p}$ is prime.\end{proposition}
\begin{proof}
Let $\mathfrak{m} \subset R_{\mathfrak{p}}$ consist of elements $x/s$ for $x
\in \mathfrak{p}$ and $s \in R - \mathfrak{p}$. It is left as an exercise
(using the primality of $\mathfrak{p}$) to
the reader to see that whether the numerator belongs to $\mathfrak{p}$ is
\emph{independent} of the representation $x/s$ used for it.
Then I claim that $\mathfrak{m}$ is the
unique maximal ideal. First, note that $\mathfrak{m}$ is
an ideal; this is evident since the numerators form an ideal. If $x/s, y/s'$
belong to $\mathfrak{m}$ with appropriate expressions, then
the numerator of
\[ \frac{xs'+ys}{ss'} \]
belongs to $\mathfrak{p}$, so this sum belongs to $\mathfrak{m}$. Moreover,
$\mathfrak{m}$ is a proper ideal because $\frac{1}{1}$ is not of the
appropriate form.
I claim that $\mathfrak{m}$ contains all other proper ideals, which will imply
that it is the unique maximal ideal. Let $I \subset R_{\mathfrak{p}}$ be any
proper ideal. Suppose $x/s \in I$. We want to prove $x/s \in \mathfrak{m}$.
In other words, we have to show $x \in \mathfrak{p}$. But if not, then $x/s$ would be
invertible (with inverse $s/x$), so $I = (1)$, a contradiction. This proves locality.
\end{proof}
\end{example}
\begin{exercise}
Any local ring is of the form $R_{\mathfrak{p}}$ for some ring $R$ and for
some prime ideal $\mathfrak{p} \subset R$.
\end{exercise}
\begin{example}
Let $R = \mathbb{Z}$. This is not a local ring; the maximal ideals are given by
$(p)$ for $p$ prime. We can thus construct the localizations
$\mathbb{Z}_{(p)}$, consisting of all fractions $a/b \in \mathbb{Q}$ where $b \notin (p)$.
That is, $\mathbb{Z}_{(p)}$ consists of the rational numbers whose denominator (in
lowest terms) is not divisible by $p$.
\end{example}
\begin{exercise}
A local ring has no idempotents other than $0$ and $1$. (Recall that $e \in R$
is \emph{idempotent} if $e^2 = e$.) In particular, the product of two nonzero rings is
never local.
\end{exercise}
It may not yet be clear why localization is such a useful process. It turns
out that many problems can be checked on the localizations at prime (or even
maximal) ideals, so certain proofs can reduce to the case of a local ring.
Let us give a small taste.
\begin{proposition}
Let $f: M \to N$ be a homomorphism of $R$-modules. Then $f$ is injective if
and only if for every maximal ideal $\mathfrak{m} \subset R$, we have that
$f_{\mathfrak{m}}: M_{\mathfrak{m}} \to N_{\mathfrak{m}}$ is injective.
\end{proposition}
Recall that, by definition, $M_{\mathfrak{m}}$ is the localization at $R -
\mathfrak{m}$.
There are many variants on this (e.g. replace with surjectivity, bijectivity).
This is a general observation that lets you reduce lots of commutative algebra
to local rings, which are easier to work with.
\begin{proof}
Suppose first that each $f_{\mathfrak{m}}$ is injective. I claim that $f$ is
injective. Suppose $x \in M - \left\{0\right\}$. We must show that $f(x) \neq
0$. If $f(x)=0$, then $f_{\mathfrak{m}}(x/1)=0$ in $N_{\mathfrak{m}}$ for every maximal ideal
$\mathfrak{m}$. Then by
injectivity of $f_{\mathfrak{m}}$ it follows that $x$ maps to zero in each $M_{\mathfrak{m}}$.
We would now like to get a contradiction.
Let $I = \left\{ a \in R: ax = 0 \in M \right\}$. This is proper since $x \neq
0$. So $I$ is contained in some maximal ideal $\mathfrak{m}$. Then $x$
maps to zero in $M_{\mathfrak{m}}$ by the previous paragraph; this means that
there is $s \in R - \mathfrak{m}$ with $sx = 0 \in M$. But $s \notin I$,
contradiction.
Now let us do the other direction. Suppose $f$ is injective and $\mathfrak{m}$
a maximal ideal; we prove $f_{\mathfrak{m}}$ injective. Suppose
$f_{\mathfrak{m}}(x/s)=0 \in N_{\mathfrak{m}}$. This means that $f(x)/s=0$ in
the localized module, so that $f(x) \in N$ is killed by some $t \in R -
\mathfrak{m}$. We thus have $f(tx) = t(f(x)) = 0 \in N$. This means that $tx
= 0 \in M$ since $f$ is injective. But this in turn means that $x/s = 0 \in
M_{\mathfrak{m}}$. This is what we wanted to show.
\end{proof}
\subsection{Localization is exact}
Localization is to be thought of as a very mild procedure.
The next result says how inoffensive localization is. This result is a key
tool in reducing problems to the local case.
\begin{proposition}
Suppose $f: M \to N, g: N \to P$ and $M \to N \to P$ is exact. Let $S \subset
R$ be multiplicatively closed. Then
\[ S^{-1}M \to S^{-1}N \to S^{-1}P \]
is exact.
\end{proposition}
Or, as one can alternatively express it, localization is an \emph{exact
functor.}
Before proving it, we note a few corollaries:
\begin{corollary}
If $f: M \to N$ is surjective, then $S^{-1}M \to S^{-1}N$ is too.
\end{corollary}
\begin{proof}
To say that $A \to B$ is surjective is the same as saying that $A \to B \to 0$
is exact. From this the corollary is evident.
\end{proof}
Similarly:
\begin{corollary}
If $f: M \to N$ is injective, then $S^{-1}M \to S^{-1}N$ is too.
\end{corollary}
\begin{proof}
To say that $A \to B$ is injective is the same as saying that $0 \to A \to B $
is exact. From this the corollary is evident.
\end{proof}
\begin{proof}[Proof of the proposition] We adopt the notation of the
proposition.
Since the composite $g\circ f$ is zero, the composite $S^{-1}M \to
S^{-1}N \to S^{-1}P$ is zero too.
Call the maps $S^{-1}M \to S^{-1}N$ and $S^{-1}N \to S^{-1}P$ by $\phi$ and $\psi$, respectively. We
know that $\psi \circ \phi = 0$ so $\ker(\psi) \supset \im(\phi)$. Conversely,
suppose something belongs to $\ker(\psi). $ This can be written as a fraction
\[ x/s \in \ker(\psi) \]
where $x \in N, s \in S$. This is mapped to
\[ g(x)/s \in S^{-1}P, \]
which we're assuming is zero. This means that there is $t \in S$ with $tg(x) =
0 \in P$. This means that $g(tx)=0$ as an element of $P$. But $tx \in N$ and
its image under $g$ vanishes, so by exactness $tx$ must come from something in $M$. In
particular,
\[ tx = f(y) \ \text{for some} \ y \in M. \]
In particular,
\[ \frac{x}{s} = \frac{tx}{ts} = \frac{f(y)}{ts} = \phi( y/ts) \in \im(\phi).
\]
This proves that anything belonging to the kernel of $\psi$ lies in
$\im(\phi)$.
\end{proof}
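In particular, applying the proposition to the exact sequence $0 \to N \to M \to M/N
\to 0$ attached to a submodule $N \subset M$ shows that localization commutes with
quotients:
\[ S^{-1}(M/N) \simeq S^{-1}M/S^{-1}N, \]
where $S^{-1}N$ is viewed as a submodule of $S^{-1}M$ via the injection above.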
\subsection{Nakayama's lemma}
We now state a very useful criterion for determining when a module over a
\emph{local} ring is zero.
\begin{lemma}[Nakayama's lemma] \label{nakayama} Let $R$ be a local ring with
maximal ideal
$\mathfrak{m}$, and let $M$ be a finitely generated $R$-module. If
$\mathfrak{m}M = M$, then $M = 0$.
\end{lemma}
Note that $\mathfrak{m}M$ is the submodule generated by products of
elements of $\mathfrak{m}$ and $M$.
\begin{remark}
Once one has the theory of the tensor product, this equivalently states that
if $M$ is finitely generated and nonzero, then
\[ M \otimes_R R/\mathfrak{m} = M/\mathfrak{m}M \neq 0. \]
So to prove that a finitely generated module over a local ring is zero, one
can reduce to studying its reduction modulo $\mathfrak{m}$, i.e. the
$R/\mathfrak{m}$-vector space $M/\mathfrak{m}M$. This is thus a very
useful criterion.
\end{remark}
Nakayama's lemma highlights why it is so useful to work over a local ring.
Thus, it is useful to reduce questions about general rings to questions about
local rings.
Before proving it, we note a corollary.
\begin{corollary}
Let $R$ be a local ring with maximal ideal $\mathfrak{m}$, and $M$ a finitely
generated module. If $N \subset M$ is a submodule such that $N +
\mathfrak{m}M =
M$, then $N=M$.
\end{corollary}
\begin{proof}
Apply Nakayama above (\cref{nakayama}) to $M/N$.
\end{proof}
We shall prove more generally:
\begin{proposition}
Suppose $M$ is a finitely generated $R$-module, $J \subset R$ an ideal.
Suppose $JM = M$. Then there is $a \in 1+J$ such that $aM = 0$.
\end{proposition}
If $J$ is the maximal ideal of a local ring, then $a$ is a unit, so that $M=0$.
\begin{proof}
Suppose $M$ is generated by $\left\{x_1, \dots, x_n\right\} \subset M$. This
means that every element of $M$ is a linear combination of elements of
$x_i$. However, each $x_i \in JM$ by assumption. In particular, each
$x_i$ can be written as
\[ x_i = \sum a_{ij} x_j, \ \mathrm{where} \ a_{ij} \in J. \]
If we let $A$ be the matrix $\left\{a_{ij}\right\}$, then $A$ sends the
vector $(x_i)$ into itself. In particular, $I-A$ kills
the vector $(x_i)$.
Now $I-A$ is an $n$-by-$n$ matrix in the ring $R$. We could, of course,
reduce everything modulo $J$ to get the identity; this is
because $A$ consists of elements of $J$. It follows that the
determinant must be congruent to $1$ modulo $J$.
In particular, $a=\det (I - A)$ lies in $1+J$.
Now by familiar linear algebra, $aI$ can be represented as the product of the
matrix of cofactors of $I-A$ with $I-A$ itself (that is,
$\mathrm{adj}(I-A)(I-A) = \det(I-A) I$); in particular, $aI$ annihilates the vector
$(x_i)$, so that $aM=0$.
\end{proof}
Before returning to the special case of local rings, we observe the following
useful fact from ideal theory:
\begin{proposition} \label{idempotentideal}
Let $R$ be a commutative ring, $I \subset R$ a finitely generated ideal such that $I^2 = I$.
Then $I$ is generated by an idempotent element.
\end{proposition}
\begin{proof}
We know that there is $x \in 1+I$ such that $xI =0$. If $x = 1-y$ with $y \in I$, it
follows that
\[ yt = t \]
for all $t \in I$. In particular, taking $t = y$ shows that $y$ is idempotent, and $(y) = I$.
\end{proof}
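\begin{example}
For instance, if $R = A \times B$ is a product of two rings, the ideal $I = A \times 0$
satisfies $I^2 = I$, and it is generated by the idempotent $e = (1,0)$.
\end{example}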
\begin{exercise}
\rref{idempotentideal} fails if the ideal is not finitely generated.
\end{exercise}
\begin{exercise}
Let $M$ be a finitely generated module over a ring $R$. Suppose $f: M \to M$
is a surjection. Then $f$ is an isomorphism. To see this, consider $M$ as a
module over $R[t]$ with $t$ acting by $f$; since $(t)M = M$, argue that there
is a polynomial $Q(t) \in R[t]$ such that $Q(t)t$ acts as the identity on
$M$, i.e. $Q(f)f=1_M$.
\end{exercise}
\begin{exercise}
Give a counterexample to the conclusion of Nakayama's lemma when the module is
not finitely generated.
\end{exercise}
\begin{exercise}
Let $M$ be a finitely generated module over the ring $R$. Let $\mathfrak{I}$
be the Jacobson
radical of $R$ (cf. \rref{Jacobson}). If $\mathfrak{I} M = M$,
then $M =
0$.
\end{exercise}
\begin{exercise}[A converse to Nakayama's lemma]
Suppose conversely that $R$ is a ring, and $\mathfrak{a} \subset R$ an ideal
such that $\mathfrak{a} M \neq M$ for every nonzero finitely generated
$R$-module. Then $\mathfrak{a}$ is contained in every maximal ideal of $R$.
\end{exercise}
\begin{exercise}
Here is an alternative proof of Nakayama's lemma. Let $R$ be local with
maximal ideal $\mathfrak{m}$, and let $M$ be a finitely generated module with
$\mathfrak{m}M = M$. Let $n$ be the minimal number of generators for $M$. If
$n>0$, pick generators $x_1, \dots, x_n$. Then write $x_1 = a_1 x_1 + \dots +
a_n x_n$ where each $a_i \in \mathfrak{m}$. Deduce that $x_1$ is in the
submodule generated by the $x_i, i \geq 2$, so that $n$ was not actually
minimal, contradiction.
\end{exercise}
Let $M, M'$ be finitely generated modules over a local ring $(R,
\mathfrak{m})$, and let $\phi: M \to M'$ be a homomorphism of modules. Then
Nakayama's lemma gives a criterion for $\phi$ to be a surjection: namely, the
map $\overline{\phi}: M/\mathfrak{m}M \to M'/\mathfrak{m}M'$ must be a surjection.
(Indeed, if $\overline{\phi}$ is surjective, then $\phi(M) + \mathfrak{m}M' = M'$, and
the above corollary applied to the submodule $\phi(M) \subset M'$ gives $\phi(M) = M'$.)
For injections, this is false. For instance, if $\phi$ is multiplication by any element of
$\mathfrak{m}$, then $\overline{\phi}$ is zero but $\phi$ may yet be injective.
Nonetheless, we give a criterion for a map of \emph{free} modules over a local ring to
be a \emph{split} injection.
\begin{proposition} \label{splitcriterion1}
Let $R$ be a local ring with maximal ideal $\mathfrak{m}$. Let $F, F'$ be two
finitely generated free $R$-modules, and let $\phi: F \to F'$ be a homomorphism.
Then $\phi$ is a split injection if and only if the reduction $\overline{\phi}$
\[ F/\mathfrak{m}F \stackrel{\overline{\phi}}{\to} F'/\mathfrak{m}F' \]
is an injection.
\end{proposition}
\begin{proof}
One direction is easy. If $\phi$ is a split injection, then it has a left
inverse
$\psi: F' \to F$ such that $\psi \circ \phi = 1_F$. The reduction of $\psi$ as a
map $F'/\mathfrak{m}F' \to F/\mathfrak{m}F$ is a left inverse to
$\overline{\phi}$, which is thus injective.
Conversely, suppose $\overline{\phi}$ injective. Let $e_1, \dots, e_r$ be a
``basis'' for $F$, and let $f_1, \dots, f_r$ be the images under $\phi$ in
$F'$. Then the reductions $\overline{f_1}, \dots, \overline{f_r}$ are linearly
independent in the $R/\mathfrak{m}$-vector space $F'/\mathfrak{m}F'$. Let us
complete this to a basis of $F'/\mathfrak{m}F'$ by adding elements
$\overline{g_1}, \dots, \overline{g_s} \in F'/\mathfrak{m}F'$, which we can
lift to elements $g_1, \dots, g_s \in F'$. It is clear that $F'$ has rank $r+s $
since its reduction $F'/\mathfrak{m}F'$ does.
We claim that the set $\left\{f_1, \dots, f_r, g_1, \dots, g_s\right\}$ is a
basis for $F'$. Indeed, we have a map
\[ R^{r+s} \to F' \]
of free modules of rank $r+s$. It can be expressed as an $(r+s)$-by-$(r+s)$ matrix
$M$; we need to show that $M$ is invertible. But if we reduce modulo
$\mathfrak{m}$, it is invertible since the reductions of $f_1, \dots, f_r,
g_1, \dots, g_s$ form a basis of $F'/\mathfrak{m}F'$.
Thus the determinant of $M$ is not in $\mathfrak{m}$, so by locality it is
invertible.
The claim about $F'$ is thus proved.
We can now define the left inverse $F' \to F$ of $\phi$. Indeed, given $x \in F'$,
we can write it uniquely as a linear combination $\sum a_i f_i + \sum b_j g_j$
by the above. We define $\psi(\sum a_i f_i + \sum b_j g_j) = \sum a_i e_i \in
F$. It is clear that this is a left inverse to $\phi$.
\end{proof}
We next note a slight strengthening of the above result, which is sometimes
useful. Namely, the first module does not have to be free.
\begin{proposition}
Let $R$ be a local ring with maximal ideal $\mathfrak{m}$. Let $M, F$ be two
finitely generated $R$-modules with $F$ free, and let $\phi: M \to F$ be a homomorphism.
Then $\phi$ is a split injection if and only if the reduction $\overline{\phi}$
\[ M/\mathfrak{m}M \stackrel{\overline{\phi}}{\to} F/\mathfrak{m}F \]
is an injection.
\end{proposition}
It will in fact follow that $M$ is itself free, because $M$ is projective (see
\cref{} below) as it is a direct summand of a free module.
\begin{proof}
Let $L$ be a ``free approximation'' to $M$.
That is, choose a basis $\overline{x_1}, \dots, \overline{x_n}$ for $M/\mathfrak{m}M$ (as an $R/\mathfrak{m}$-vector
space) and lift this to elements $x_1, \dots, x_n \in M$. Define a map
\[ L = R^n \to M \]
by sending the $i$th basis vector to $x_i$.
Then $L/\mathfrak{m} L \to M/\mathfrak{m}M$ is an isomorphism.
By Nakayama's lemma,
$L \to M$ is surjective.
Then the composite map
$L \to M \to F$ is such that the reduction $L/\mathfrak{m}L \to F/\mathfrak{m}F$ is injective
(being the composite of the isomorphism $L/\mathfrak{m}L \simeq M/\mathfrak{m}M$ with the injection $\overline{\phi}$), so
$L \to F$ is a split injection (by \cref{splitcriterion1}).
It follows that we can find a splitting $F \to L$, which when composed with $L
\to M$ is a splitting of $M \to F$.
\end{proof}
\begin{exercise}
Let $A$ be a local ring, and $B$ a ring which is finitely generated and free as an
$A$-module. Suppose $A \to B$ is an injection. Then $A \to B$ is a \emph{split
injection.} (Note that any nonzero morphism mapping out of a field is
injective.)
\end{exercise}
\section{The functor $\hom$}
In any category, the morphisms between two objects form a
set.\footnote{Strictly speaking, this may depend on your set-theoretic
foundations.} In many
categories, however, the hom-sets have additional structure. For instance,
the hom-sets
between abelian groups are themselves abelian groups. The same situation holds
for the category of modules over a commutative ring.
\begin{definition}
Let $R$ be a commutative ring and let $M,N$ be $R$-modules. We write
$\hom_R(M,N)$ for
the set of all $R$-module homomorphisms $M \to N$.
$\hom_R(M,N)$ is an $R$-module because one can add homomorphisms $f,g: M
\to N$ by adding
them pointwise: if $f,g$ are homomorphisms $M \to N$, define $f+g: M \to N$ via
\( (f+g)(m) = f(m)+g(m); \)
similarly, one can multiply homomorphisms $f: M \to N$ by elements $ a \in
R$: one sets
\( (af)(m) = a(f(m)). \)
\end{definition}
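\begin{example}
For instance, for abelian groups (that is, $\mathbb{Z}$-modules) and positive integers
$m, n$, one has $\hom_{\mathbb{Z}}(\mathbb{Z}/m\mathbb{Z}, \mathbb{Z}/n\mathbb{Z})
\simeq \mathbb{Z}/\gcd(m,n)\mathbb{Z}$: a homomorphism is determined by the image of
$1$, which can be any element killed by $m$. Similarly,
$\hom_{\mathbb{Z}}(\mathbb{Z}/m\mathbb{Z}, \mathbb{Z}) = 0$, since $\mathbb{Z}$ has no
torsion.
\end{example}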
Recall that in any category, the hom-sets are \emph{functorial}. For instance,
given $f: N \to N'$, post-composition with $f$ defines a map $\hom_R(M,N) \to
\hom_R(M,N')$ for any $M$.
Similarly precomposition gives a natural map $\hom_R(N', M) \to \hom_R(N, M)$.
In particular, we get a bifunctor $\hom$, contravariant in the first variable
and covariant in the second, of $R$-modules into $R$-modules.
\subsection{Left-exactness of $\hom$}
We now discuss the exactness properties of this construction of forming
$\hom$-sets. The following result is basic and is, in fact, a reflection of
the universal property of the kernel.
\begin{proposition} \label{homcovleftexact}
If $M$ is an $R$-module, then the functor
\[ N \to \hom_R(M,N) \]
is left exact (but \emph{not exact} in general).
\end{proposition}
This means that if
\[ 0 \to N' \to N \to N'' \]
is exact,
then
\[ 0 \to \hom_R(M, N') \to \hom_R(M, N) \to \hom_R(M, N'') \]
is exact as well.
\begin{proof}
First, we have to show that the map
$\hom_R(M,N') \to \hom_R(M,N)$ is injective; this is because $N' \to N$ is
injective, and composition with $N' \to N$ can't kill any nonzero $M \to N'$.
Similarly, exactness in the middle can be checked easily, and follows from
\rref{univpropertykernel}; it states simply that a map $M \to N$ has
image landing inside $N'$ (i.e. factors through $N'$) if and only if it
composes to zero in $N''$.
\end{proof}
\newcommand{\ol}[1]{\mathbf{#1}}
This functor $\hom_R(M, \cdot)$ is not exact in general. Indeed:
\begin{example}
Suppose $R = \mathbb{Z}$, and consider the $R$-module (i.e. abelian group)
$M = \mathbb{Z}/2\mathbb{Z}$. There is a short exact
sequence
\[ 0 \to 2\mathbb{Z} \to \mathbb{Z} \to \mathbb{Z}/2\mathbb{Z} \to 0. \]
Let us apply $\hom_R(M, \cdot)$. We get a \emph{complex}
\[ 0 \to \hom(\mathbb{Z}/2\mathbb{Z}, 2\mathbb{Z}) \to
\hom(\mathbb{Z}/2\mathbb{Z}, \mathbb{Z}) \to \hom(\mathbb{Z}/2\mathbb{Z},
\mathbb{Z}/2\mathbb{Z}) \to 0. \]
The second-to-last term is $\mathbb{Z}/2\mathbb{Z}$; everything else is
zero. Thus the sequence is not exact, and in particular the functor
$\hom_{\mathbb{Z}}(\mathbb{Z}/2, -)$ is not an exact functor.
\end{example}
We have seen that homming out of a module is left-exact. Now, we see the same
for homming \emph{into} a module.
\begin{proposition} \label{homcontleftexact}
If $M$ is a module, then $\hom_R(-,M)$ is a left-exact contravariant functor.
\end{proposition}
We write this proof in slightly more detail than \cref{homcovleftexact},
because of the
contravariance.
\begin{proof}
We want to show that $\hom(\cdot, M)$ is a left-exact contravariant functor,
which means that
if $ A \xrightarrow u B \xrightarrow v C \to 0$ is exact, then
$$
0 \to \hom(C, M) \xrightarrow{\ol v} \hom(B, M) \xrightarrow{\ol u} \hom(A, M)
$$
is exact. Here, the bold notation refers to the induced maps of $u,v$ on the
hom-sets: if $f \in \hom(B,M)$ and $g \in \hom(C, M)$, we define
$\ol u$ and $\ol v$ via $\ol v(g) = g \circ v$ and
$\ol u(f) = f \circ u$.
Let us show first that $\ol v$ is injective.
Suppose that $g \in \hom(C, M)$. If $\ol v(g) = g \circ v = 0$ then
$(g \circ v)(b) = 0$ for all $b \in B$. Since $v$ is a surjection, this means
that $g(C) = 0$ and hence $g = 0$. Therefore, $\ol v$ is injective, and we
have exactness at $\hom(C, M)$.
Since $v \circ u = 0$, it is clear that $\ol u \circ \ol v = 0$.
Now, suppose that $f \in \ker(\ol u) \subset \hom(B, M)$. Then
$\ol u(f) = f \circ u = 0$.
Thus $f: B \to M$ factors through $B/\im(u)$.
However, $\im(u) = \ker(v)$, so $f$ factors through $B/\ker(v)$.
Exactness shows that there is an isomorphism $B/\ker(v) \simeq C$.
In particular, we find that $f$ factors through $C$. This is what we wanted.
\end{proof}
\begin{exercise}
Come up with an example where $\hom_R(-, M)$ is not exact.
\end{exercise}
\begin{exercise}
Over a \emph{field}, $\hom$ is always exact.
\end{exercise}
\subsection{Projective modules}
Let $M$ be an $R$-module for a fixed commutative ring $R$. We have seen that
$\hom_R(M,-)$ is generally only a left-exact functor.
Sometimes, however, we do have exactness. We axiomatize this with the
following.
\begin{definition} \label{projectives}
An $R$-module $M$ is called \textbf{projective} if the functor $\hom_R(M,
\cdot)$ is
exact.\footnote{It is possible to define a projective module over a
noncommutative ring. The definition is the same, except that the $\hom$-sets
are no longer modules, but simply abelian groups. }
\end{definition}
One may first observe that a free module is projective.
Indeed, let $F = R^I$ for an indexing set $I$. Then the functor $N \to \hom_R(F,
N)$ is
naturally
isomorphic to $N \to N^I$. It is easy to see that this functor preserves
exact sequences (that is, if $0 \to A \to B \to C \to 0$ is exact, so is $0
\to A^I \to B^I \to C^I \to 0$).
Thus $F$ is projective.
One can also easily check that a \emph{direct summand} of a projective module
is projective.
It turns out that projective modules have a very clean characterization. They
are \emph{precisely} the direct
summands in free modules.
\add{check this}
\begin{proposition} \label{projmod}
The following are equivalent for an $R$-module $M$:
\begin{enumerate}
\item $M$ is projective.
\item Given any map $M \to N/N'$ from $M$ into a quotient $N/N'$ of an
$R$-module $N$, we can lift
it to a map $M \to N$.
\item There is a module $M'$ such that $M \oplus M'$ is free.
\end{enumerate}
\end{proposition}
\begin{proof}
The equivalence of 1 and 2 is just unwinding the definition of projectivity,
because we just need to show that $\hom_R(M, \cdot)$ preserves surjective
maps, i.e. quotients. ($\hom_R(M, \cdot)$ is already left-exact, after all.)
To say that $\hom_R(M, N) \to \hom_R(M, N/N')$ is surjective is just the
statement that any map $M \to N/N'$
can be lifted to $M \to N$.
Let us show that 2 implies 3. Suppose $M$ satisfies 2. Then choose a
surjection $P \twoheadrightarrow M$ where $P$ is free, by
\cref{freesurjection}. Then we can
write $M \simeq P/P'$ for a submodule $P' \subset P$. The isomorphism map
$M \to P/P'$
leads by 2 to a lifting $M \to P$. In particular, there is a section of $P
\to M$,
namely this lifting. Since a section leads to a split exact sequence by
\cref{}, we find then that $P \simeq \ker(P \to M) \oplus \im(M \to P) \simeq
\ker(P \to M) \oplus M$,
verifying 3 since $P$ is free.
Now let us show that 3 implies 2.
Suppose $M \oplus M'$ is free, isomorphic to $P$. Then a map $M \to N/N'$ can
be extended to
\[ P \to N/N' \]
by declaring it to be trivial on $M'$. But now $P \to N/N'$ can be lifted to
$N$ because $P$ is free, and we have observed that a free module is
projective above; alternatively, we just lift the image of a basis. This
defines $P
\to N$. We may then compose this with the inclusion $M \to P$ to get the
desired map $M \to P \to N$,
which is a lifting of $M \to N/N'$.
\end{proof}
Of course, the lifting $P \to N$ of a given map $P \to N/N'$ is generally not
unique, and in fact is unique precisely when $\hom_R(P,N') = 0$.
So projective modules are precisely those with the following lifting property.
Consider a diagram
\[ \xymatrix{
& P \ar[d] \\
M \ar[r] & M'' \ar[r] & 0
}\]
where the bottom row is exact. Then, if $P$ is projective, there is a lifting
$P \to M$ making commutative the diagram
\[ \xymatrix{
& P \ar[d]\ar@{-->}[ld] \\
M \ar[r] & M'' \ar[r] & 0
}\]
\begin{corollary}
Let $M$ be a module. Then there is a surjection $P \twoheadrightarrow M$,
where $P$ is projective.
\end{corollary}
\begin{proof}
Indeed, we know (\rref{freesurjection}) that we can always get a surjection
from a free
module. Since free modules are projective by \rref{projmod}, we are
done.
\end{proof}
\begin{exercise}
Let $R$ be a principal ideal domain, $F'$ a submodule of a free module
$F$. Show that
$F'$ is free. (Hint: well-order the set of generators of $F$, and climb up by
transfinite induction.)
In particular, any projective module over a principal ideal domain is free.
\end{exercise}
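Over more general rings, however, projective modules need not be free. Here is a simple
example.
\begin{example}
Let $R = \mathbb{Z}/6\mathbb{Z}$. The ideal $I = (2) = \left\{0,2,4\right\}$ satisfies
$R \simeq (2) \oplus (3)$ as $R$-modules, so $I$ is a direct summand of a free module
and hence projective by \cref{projmod}. But $I$ has three elements, while any nonzero
free $R$-module has at least six, so $I$ is not free.
\end{example}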
\subsection{Example: the Serre-Swan theorem}
We now briefly digress to describe an important correspondence between
projective modules and vector bundles. The material in this section will not
be used in the sequel.
Let $X$ be a compact space. We shall not recall the topological notion of a
\emph{vector bundle} here.
We note, however, that if $E$ is a (complex) vector bundle,
then the set $\Gamma(X, E)$ of global sections is naturally a module over the
ring $C(X)$ of complex-valued continuous functions on $X$.
\begin{proposition}
If $E$ is a vector bundle on a compact Hausdorff space $X$, then there is a
surjection $\mathcal{O}^N \twoheadrightarrow E$ for some $N$.
\end{proposition}
Here $\mathcal{O}^N$ denotes the trivial bundle.
It is known that in the category of vector bundles, every epimorphism splits.
In particular, it follows that $E$ can be viewed as a \emph{direct summand} of
the bundle $\mathcal{O}^N$. Since $\Gamma(X, E)$ is then a direct summand of
$\Gamma(X, \mathcal{O}^N) = C(X)^N$, we find that $\Gamma(X, E)$ is a direct
summand of a projective $C(X)$-module. Thus:
\begin{proposition}
$\Gamma(X, E)$ is a finitely generated projective $C(X)$-module.
\end{proposition}
\begin{theorem}[Serre-Swan]
The functor $E \mapsto \Gamma(X, E)$ induces an equivalence of categories
between vector bundles on $X$ and finitely generated projective modules over
$C(X)$.
\end{theorem}
\subsection{Injective modules}
\label{ssecinj}
We have given a complete answer to the question of when the functor
$\hom_R(M,-)$ is exact. We have shown that there are a lot of such
\emph{projective} modules in the category of $R$-modules, enough that any
module admits a surjection from one such.
However, we now have to answer the dual question: when is the functor
$\hom_R(-, Q)$ exact?
Let us make the dual definition:
\begin{definition}
An $R$-module $Q$ is \textbf{injective} if the functor $\hom_R(-,Q)$ is exact.
\end{definition}
Thus, a module $Q$ over a ring $R$ is injective if
whenever $M \to N$ is an injection, and one has a map $M \to Q$, it can be
extended to $N \to Q$: in other words, $\hom_R(N,Q ) \to \hom_R(M,Q)$ is
surjective.
We can visualize this by a diagram
\[ \xymatrix{
0 \ar[r] & M \ar[r] \ar[d] & N \ar@{-->}[ld] \\
& Q
}\]
where the dotted arrow always exists if $Q$ is injective.
The notion is dual to projectivity, in some sense, so just as every module $M$
admits an epimorphic map $P \to M$ for $P$ projective, we expect by duality
that every module admits a monomorphic map $M \to Q$ for $Q$ injective.
This is in fact true, but will require some work.
We start, first, with a fact about injective abelian groups.
\begin{theorem}\label{divisibleimpliesinj}
A divisible abelian group (i.e. one where the map $x
\to nx$ is surjective for every positive integer $n$) is injective as a
$\mathbb{Z}$-module (i.e. abelian group).
\end{theorem}
\begin{proof}
The actual idea of the proof is rather simple, and similar to the proof
of the Hahn-Banach theorem.
Namely, we extend bit by bit, and then use Zorn's lemma.
The first step is that we have a subgroup $M $ of a larger abelian
group $N$.
We have a map $f:M \to Q$ for $Q$ some divisible abelian group, and we
want to extend it to $N$.
Now we can consider the poset of pairs $(\tilde{f}, M')$ where $M \subset M' \subset N$
and $\tilde{f}: M' \to Q$ is a map extending $f$.
Naturally, we make this into a poset by defining the order as ``$(\tilde{f},
M') \leq (\tilde{f}', M'')$ if $M'' $ contains $M'$ and $\tilde{f}'$
is an extension of $\tilde{f}$.''
It is clear that every chain has an upper bound, so Zorn's lemma implies
that we have a submodule $M' \subset N$ containing $M$, and a map $\tilde{f}: M'
\to Q$ extending $f$, such that there is no proper extension of $\tilde{f}$.
From this we will derive a contradiction unless $M' = N$.
So suppose we have $M' \neq N$, for $M'$ the maximal submodule to which $f$
can be extended, as in the above paragraph. Pick $m \in N - M'$, and consider the
submodule $M' + \mathbb{Z} m \subset N$. We are going to show how to extend $\tilde{f}$
to this bigger submodule. First, suppose $\mathbb{Z}m \cap M' = \{0\}$,
i.e. the sum is direct. Then we can extend $\tilde{f}$ because $M' +
\mathbb{Z}m$ is a
direct sum: just define it to be zero on $\mathbb{Z}m$.
The slightly harder part is what happens if $\mathbb{Z} m \cap M' \neq \{ 0\}$.
In this case, there is an ideal $I \subset \mathbb{Z}$ such that $n \in I$
if and only if $nm \in M'$.
This ideal, however, is principal; let $g \in \mathbb{Z} - \left\{0\right\}$ be a generator. Then $gm = p
\in M'$. In particular, $\tilde{f}(gm)$ is defined.
We can ``divide'' this
by $g$, i.e. find $u \in Q$ such that $gu = \tilde{f}(gm)$.
Now we may extend to a
map $\tilde{f}'$ from $\mathbb{Z} m + M'$ into $Q$ as follows. Choose $m'
\in M', k \in \mathbb{Z}$. Define $\tilde{f}'( m' + km) = \tilde{f}(m')
+ k u$. It is easy to see that this is well-defined by the choice of $u$,
and gives a proper extension of $\tilde{f}$. This contradicts maximality of
$M'$ and completes the proof.
\end{proof}
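\begin{example}
For example, $\mathbb{Q}$ and $\mathbb{Q}/\mathbb{Z}$ are divisible, hence injective
abelian groups; the latter will be used repeatedly below. On the other hand,
$\mathbb{Z}$ is not injective: the map $2\mathbb{Z} \to \mathbb{Z}$ sending $2n \to n$
does not extend to a homomorphism $\mathbb{Z} \to \mathbb{Z}$.
\end{example}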
\begin{exercise}
\cref{divisibleimpliesinj} works over any principal ideal domain.
\end{exercise}
\begin{exercise}[Baer] \label{baercriterion}
Let $N$ be an $R$-module such that for any ideal $I \subset R$, any morphism
$I \to N$ can be extended to $R \to N$. Then $N$ is injective. (Imitate the
above argument.)
\end{exercise}
From this, we may prove:
\begin{theorem}
Any $R$-module $M$ can be imbedded in an injective $R$-module $Q$.
\end{theorem}
\begin{proof}
First of all, we know that any $R$-module $M$ is a quotient of a free
$R$-module. We are going to show that the {dual} (to be defined shortly) of a free module is injective. And so since
every module admits a surjection from a free module, we will use a dualization
argument to prove the present theorem.
First, for any abelian group $G$, define the \textbf{dual group} as $G^\vee
= \hom_{\mathbb{Z}}(G, \mathbb{Q}/\mathbb{Z})$.
Dualization is clearly a contravariant functor from abelian groups to abelian
groups.
By \cref{homcontleftexact}
and \cref{divisibleimpliesinj}, an exact
sequence of groups
\[ 0 \to A \to B \to C \to 0 \]
induces an exact sequence
\[ 0 \to C^\vee \to B^\vee \to A^\vee \to 0 .\]
In particular, dualization is an exact functor:
\begin{proposition} Dualization preserves exact sequences (but reverses
the order).
\end{proposition}
Now, we are going to apply this to $R$-modules. The dual of a left $R$-module
is acted upon by $R$.
The action, which is natural enough, is as follows. Let $M$ be an
$R$-module, and $f: M \to
\mathbb{Q}/\mathbb{Z}$ be a homomorphism of abelian groups (since
$\mathbb{Q}/\mathbb{Z}$ has in general no $R$-module structure), and $r \in
R$; then we define $rf$ to be the map $M \to \mathbb{Q}/\mathbb{Z}$ defined via
\[ (rf)(m) = f(rm).\]
It is easy to check that $M^{\vee}$ is thus made into an
$R$-module.\footnote{If $R$ is noncommutative, this would not work: instead
$M^{\vee}$ would be a \emph{right} $R$-module. For commutative rings, we have
no such distinction between left and right modules.}
In particular, dualization into $\mathbb{Q}/\mathbb{Z}$ gives a contravariant
exact functor from $R$-\emph{modules} to $R$-\emph{modules}.
Let $M$ be as before,
and now consider the $R$-module $M^{\vee}$. By \cref{freesurjection}, we can
find a free
module $F$ and a surjection
\[ F \to M^{\vee} \to 0.\]
Now dualizing gives an exact sequence of $R$-modules
\[ 0 \to M^{\vee \vee} \to F^{\vee}. \]
However, there is a natural map (of $R$-modules) $M \to M^{\vee \vee}$: given $m \in M$, we can
define a functional $\hom(M, \mathbb{Q}/\mathbb{Z}) \to \mathbb{Q}/\mathbb{Z}$
by evaluation at $m$. One can check that this is a homomorphism. Moreover, this morphism $M \to M^{\vee \vee}$ is actually injective: if $m \in M$ were
in the kernel, then by definition every functional $M \to
\mathbb{Q}/\mathbb{Z}$ must vanish on $m$. It is easy to see (using
$\mathbb{Z}$-injectivity of $\mathbb{Q}/\mathbb{Z}$) that this cannot happen
if $m \neq 0$: we could just pick a nontrivial functional on the monogenic
\emph{subgroup} $\mathbb{Z} m$ and extend to $M$.
We claim now that $F^{\vee}$ is injective. This will prove the theorem, as
we have the composite of monomorphisms $M \hookrightarrow M^{\vee \vee} \hookrightarrow F^{\vee}$ that
embeds $M$ inside an injective module.
\begin{lemma} The dual of a free $R$-module $F$ is an injective
$R$-module.
\end{lemma}
\begin{proof}
Let $0 \to A \to B $ be exact; we have to show that
\[ \hom_R( B, F^\vee) \to \hom_R(A, F^\vee) \to 0 .\]
is exact.
Now we can reduce to the case where $F$ is the $R$-module $R$ itself.
Indeed, $F$ is a direct sum of $R$'s by assumption, and taking hom's turns
them into direct products; moreover the direct product of
exact sequences is exact.
So we are reduced to showing that $R^{\vee}$ is injective.
Now we claim that
\begin{equation} \label{weirddualityexpr} \hom_R(B, R^{\vee}) =
\hom_{\mathbb{Z}}(B, \mathbb{Q}/\mathbb{Z}). \end{equation}
In particular, $\hom_R( -, R^\vee)$ is an exact functor because
$\mathbb{Q}/\mathbb{Z}$ is an injective abelian group.
The proof of \cref{weirddualityexpr} is actually ``trivial.'' For instance,
an $R$-homomorphism $f: B \to R^\vee$ induces $\tilde{f}: B \to
\mathbb{Q}/\mathbb{Z}$ by sending $b \to (f(b))(1)$. One checks that this
is bijective.
\end{proof}
\end{proof}
\subsection{The small object argument}
There is another, more set-theoretic approach to showing that any $R$-module
$M$ can be imbedded in an injective module.
This approach, which constructs the injective module by a transfinite
colimit of push-outs, is essentially analogous to the ``small object
argument'' that one uses in homotopy theory to show that certain categories
(e.g. the category of CW complexes) are model categories in the sense of
Quillen; see \cite{Ho07}.
While this method is somewhat abstract and more complicated than the one of
\cref{ssecinj}, it is also more general. Apparently this method originates with Baer,
and was revisited by Cartan and Eilenberg in
\cite{Cartan-Eilenberg} and by Grothendieck in \cite{Gr57}.
There Grothendieck uses it to show that
many other abelian categories have enough injectives.
We first begin with a few remarks on smallness.
Let $\{B_{\alpha}\}, \alpha \in \mathcal{A}$ be an inductive system of objects in some
category $\mathcal{C}$, indexed by
an ordinal $\mathcal{A}$. Let us assume that $\mathcal{C}$ has (small)
colimits. If $A$ is an object of $\mathcal{C}$, then there is a
natural map
\begin{equation} \label{naturalmapcolim} \varinjlim \hom(A, B_\alpha) \to
\hom(A, \varinjlim B_\alpha) \end{equation}
because if one is given a map $A \to B_\beta$ for some $\beta$, one
naturally gets a map from $A$ into the colimit by composing with $B_\beta
\to \varinjlim B_\alpha$. (Note that the left colimit is one of sets!)
In general, the map \cref{naturalmapcolim} is neither injective nor surjective.
\begin{example}
Consider the category of sets. Let $A = \mathbb{N}$ and $B_n = \left\{1,
\dots, n\right\}$ be the inductive system indexed by the natural numbers
(where $B_n \to B_{m}, n \leq m$ is the obvious map). Then $\varinjlim B_n =
\mathbb{N}$, so there is a map
\[ A \to \varinjlim B_n, \]
which does not factor as
\[ A \to B_m \]
for any $m$. Consequently, $\varinjlim \hom(A, B_n) \to \hom(A, \varinjlim
B_n)$ is not surjective.
\end{example}
\begin{example}
Next we give an example where the map fails to be injective. Let $B_n =
\mathbb{N}/\left\{1, 2, \dots, n\right\}$, that is, the quotient set of
$\mathbb{N}$ with the first $n$ elements collapsed to one element.
There are natural maps $B_n \to B_m$ for $n \leq m$, so the
$\left\{B_n\right\}$ form an inductive system. It is easy to see that the
colimit $\varinjlim B_n = \left\{\ast \right\}$: it is the one-point set.
So it follows that $\hom(A, \varinjlim B_n)$ is a one-element set.
However, $\varinjlim \hom(A , B_n)$ is \emph{not} a one-element set.
Consider the family of maps $A \to B_n$ which are just the natural projections
$\mathbb{N} \to \mathbb{N}/\left\{1, 2, \dots, n\right\}$ and the family of
maps $A \to B_n$ which map the whole of $A$ to the class of $1$.
These two families of maps are distinct at each step and thus are distinct in
$\varinjlim \hom(A, B_n)$, but they induce the same map $A \to \varinjlim B_n$.
\end{example}
Nonetheless, if $A$ is a \emph{finite set}, it is easy to see that for any
sequence of sets $B_1 \to B_2 \to \dots$, we have
\[ \varinjlim \hom(A, B_n) = \hom(A, \varinjlim B_n). \]
\begin{proof}
Let $f: A \to \varinjlim B_n$. The range of $f$ is finite, containing say
elements $c_1, \dots, c_r \in \varinjlim B_n$. These all come from some
elements in $B_N$ for $N$ large by definition of the colimit. Thus we can
define $\widetilde{f}: A \to B_N$ lifting $f$ at a finite stage.
Next, suppose two maps $f: A \to B_m,
g : A \to B_m$ define the same map $A \to \varinjlim B_n$.
Then each of the finitely many elements of $A$ gets sent to the same point in
the colimit. By definition of the colimit for sets, there is $N \geq m$ such
that the finitely many elements of $A$ get sent to the same points in $B_N$
under $f$ and $g$. This shows that $\varinjlim \hom(A, B_n) \to \hom(A,
\varinjlim B_n)$ is injective.
\end{proof}
The essential idea is that $A$ is ``small'' relative to the long chain of
compositions $B_1 \to B_2 \to \dots$, so that it has to factor through a
finite step.
Let us generalize this.
\begin{definition} \label{smallness}
Let $\mathcal{C}$ be a category, $I $ a class of maps, and $\omega$ an ordinal.
An object $A \in \mathcal{C}$ is said to be $\omega$-\textbf{small} (with
respect to $I$) if
whenever $\{B_\alpha\}$ is an inductive system parametrized by $\omega$ with
maps in $I$, then
the map
\[ \varinjlim \hom(A, B_\alpha) \to \hom(A, \varinjlim B_\alpha) \]
is an isomorphism.
\end{definition}
Our definition varies slightly from that of \cite{Ho07}, where only ``nice''
transfinite sequences $\left\{B_\alpha\right\}$ are considered.
In our applications, we shall begin by restricting ourselves to the category
of $R$-modules for a fixed commutative ring $R$.
We shall also take $I$ to be the set of \emph{monomorphisms,} or
injections.\footnote{There are, incidentally, categories, such as the category
of rings, where a categorical epimorphism may not be a surjection of sets.}
Then each of the maps
\[ B_\beta \to \varinjlim B_\alpha \]
is an injection, so it follows that
$\hom(A, B_\beta )\to \hom (A, \varinjlim B_\alpha)$ is one, and in
particular the canonical map
\begin{equation} \label{homcolimmap} \varinjlim \hom(A, B_\alpha) \to \hom (A,
\varinjlim B_\alpha) \end{equation}
is an \emph{injection.}
We can in fact interpret the $B_\alpha$'s as subobjects of the big module
$\varinjlim B_\alpha$, and think of their union as $\varinjlim B_\alpha$.
(This is not an abuse of notation if we identify $B_\alpha$ with the image in
the colimit.)
We now want to show that modules are always small for ``large'' ordinals
$\omega$.
For this, we have to digress to do some set theory:
\begin{definition}
Let $\omega$ be a \emph{limit} ordinal, and $\kappa$ a cardinal. Then $\omega$ is
\textbf{$\kappa$-filtered} if every collection $C$ of ordinals strictly less
than $\omega$ and of cardinality at most $\kappa$ has an upper bound strictly
less than $\omega$.
\end{definition}
\begin{example} \label{limitordfinfiltered}
A limit ordinal (e.g. the natural numbers $\omega_0$) is $\kappa$-filtered for any finite cardinal $\kappa$.
\end{example}
\begin{proposition}
Let $\kappa$ be a cardinal. Then there exists a $\kappa$-filtered ordinal
$\omega$.
\end{proposition}
\begin{proof}
If $\kappa$ is finite, \cref{limitordfinfiltered} shows that any limit ordinal
will do. Let us thus assume that $\kappa$ is infinite.
Consider the smallest ordinal $\omega$ whose cardinality is strictly greater
than that of $\kappa$. Then we claim that $\omega$ is $\kappa$-filtered.
Indeed, if $C$ is a collection of at most $\kappa$ ordinals strictly smaller
than $\omega$, then each of these ordinals is of size at most $\kappa$. Thus
the union of all the ordinals in $C$ (which is an ordinal) is of size at most
$\kappa$, so is strictly smaller than $\omega$, and it provides an upper bound as in the definition.
\end{proof}
\begin{proposition} \label{modulesaresmall}
Let $M$ be a module, $\kappa$ the cardinality of the set of its submodules.
Then if $\omega$ is $\kappa$-filtered, then $M$ is $\omega$-small (with
respect to injections).
\end{proposition}
The proof is straightforward, but let us first think about a special case. If
$M$ is finite, then the claim is that for any inductive system
$\left\{B_\alpha\right\}$ with injections between them, parametrized by a
limit ordinal, any map $M \to
\varinjlim B_\alpha$ factors through one of the $B_\alpha$. But this is clear.
Since $M$ is finite and each element in the image must land inside one of the
$B_\alpha$, all of $M$ lands inside some $B_\alpha$.
\begin{proof}
We need only show that the map \cref{homcolimmap} is a surjection when
$\omega$ is $\kappa$-filtered.
Let $f: M \to \varinjlim B_\alpha$ be a map.
Consider the subobjects $\{f^{-1}(B_\alpha)\}$ of $M$, where $B_\alpha$ is considered as a
subobject of the colimit. If one of these, say $f^{-1}(B_\beta)$, fills $M$,
then the map factors through $B_\beta$.
So suppose to the contrary that all of the $f^{-1}(B_\alpha)$ were proper
subobjects of $M$.
However, we know that
\[ \bigcup f^{-1}(B_\alpha) = f^{-1}\left(\bigcup B_\alpha\right) = M. \]
Now there are at most $\kappa$ different subobjects of $M$ that occur among
the $f^{-1}(B_\alpha)$, by hypothesis.
Thus we can find a set $C$ of ordinals, of cardinality at most $\kappa$, such that as
$\alpha'$ ranges over $C$, the
$f^{-1}(B_{\alpha'})$ range over \emph{all} the $f^{-1}(B_\alpha)$.
However, $C$ has an upper bound $\widetilde{\omega} < \omega$ as $\omega$ is
$\kappa$-filtered. In particular,
all the $f^{-1}(B_{\alpha'})$ are contained in
$f^{-1}(B_{\widetilde{\omega}})$. It follows that
$f^{-1}(B_{\widetilde{\omega}}) = M$.
In particular, the map $f$ factors through $B_{\widetilde{\omega}}$.
\end{proof}
From this, we will be able to deduce the existence of lots of injectives.
Let us recall the criterion of Baer (\cref{baercriterion}): a module $Q$ is
injective if and only if in every commutative diagram
\[ \xymatrix{
\mathfrak{a} \ar[d] \ar[r] & Q \\
R \ar@{-->}[ru]
}\]
for $\mathfrak{a} \subset R$ an ideal, the dotted arrow exists. In other
words, we are trying to solve an \emph{extension problem} with respect to the
inclusion $\mathfrak{a} \hookrightarrow R$ into the module $Q$.
If $Q$ is an $R$-module, then in general we may have a semi-complete diagram as above. In
it, we can form the \emph{push-out}
\[ \xymatrix{
\mathfrak{a} \ar[d] \ar[r] & Q \ar[d] \\
R \ar[r] & R \oplus_{\mathfrak{a}} Q
}.\]
Here the vertical map is injective, and the diagram commutes. The point is
that we can extend $\mathfrak{a} \to Q$ to $R$ \emph{if} we extend $Q$ to the
larger module $R \oplus_{\mathfrak{a}} Q$.
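Concretely, this push-out can be described as the quotient
\[ R \oplus_{\mathfrak{a}} Q = (R \oplus Q)/\left\{ (a, -\phi(a)) : a \in \mathfrak{a}
\right\}, \]
where $\phi: \mathfrak{a} \to Q$ denotes the given map; the vertical map sends $q \in Q$
to the class of $(0,q)$, and it is injective because $(0,q)$ lies in the submodule being
quotiented by only if $q = 0$.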
The point of the small object argument is to repeat this procedure
transfinitely many times.
\begin{theorem}
Let $M$ be an $R$-module. Then there is an embedding $M \hookrightarrow Q$ for
$Q$ injective.
\end{theorem}
\begin{proof}
We start by defining a functor $\mathbf{M}$ on the category of $R$-modules.
Given $N$, we consider the set of all maps $\mathfrak{a} \to N$ for
$\mathfrak{a} \subset R$ an ideal, and consider the push-out
\begin{equation} \label{hugediag}
\xymatrix{
\bigoplus \mathfrak{a}\ar[r] \ar[d] & N \ar[d] \\
\bigoplus R \ar[r] & N \oplus_{\bigoplus \mathfrak{a}} \bigoplus R
}
\end{equation}
where the direct sum of copies of $R$ is taken such that every copy of an
ideal $\mathfrak{a}$ corresponds to one copy of $R$.
We define $\mathbf{M}(N)$ to be this push-out. Given a map $N \to N'$, there
is a natural morphism of diagrams \cref{hugediag}, so $\mathbf{M}$ is a
functor.
Note furthermore that there is a natural transformation
\[ N \to \mathbf{M}(N), \]
which is \emph{always an injection.}
The key property of $\mathbf{M}$ is that if $\mathfrak{a} \to N$ is any
morphism, it can be extended to $R \to \mathbf{M}(N)$, by the very
construction of $\mathbf{M}(N)$. The idea will now be to
apply $\mathbf{M}$ a transfinite number of times and to use the small object
property.
We define for each ordinal $\omega$ a functor $\mathbf{M}_{\omega}$ on the
category of $R$-modules, together with a natural injection $N \to
\mathbf{M}_{\omega}(N)$. We do this by transfinite induction.
First, $\mathbf{M}_1 = \mathbf{M}$ is the functor defined above.
Now, suppose given an ordinal $\omega$, and suppose $\mathbf{M}_{\omega'}$ is
defined for $\omega' < \omega$. If $\omega$ has an immediate predecessor
$\widetilde{\omega}$, we let
$$\mathbf{M}_{\omega} = \mathbf{M} \circ \mathbf{M}_{\widetilde{\omega}}.$$
If not, we let $\mathbf{M}_{\omega}(N) = \varinjlim_{\omega' < \omega}
\mathbf{M}_{\omega'}(N)$.
It is clear (e.g. inductively) that the $\mathbf{M}_{\omega}(N)$ form an inductive system over
ordinals $\omega$, so this is reasonable.
Let $\kappa$ be the cardinality of the set of ideals in $R$, and let $\Omega$
be a $\kappa$-filtered ordinal.
The claim is as follows.
\begin{lemma}
For any $N$, $\mathbf{M}_{\Omega}(N)$ is injective.
\end{lemma}
If we prove this, we will be done. In fact, we will have shown that there is a
\emph{functorial} embedding of a module into an injective.
Thus, we have only to prove this lemma.
\begin{proof}
By Baer's criterion (\cref{baercriterion}), it suffices to show that if
$\mathfrak{a} \subset R$ is an ideal, then any map $f: \mathfrak{a} \to
\mathbf{M}_{\Omega}(N)$ extends to $R \to \mathbf{M}_{\Omega}(N)$. However, we
know since $\Omega$ is a limit ordinal that
\[ \mathbf{M}_{\Omega}(N) = \varinjlim_{\omega < \Omega}
\mathbf{M}_{\omega}(N), \]
so by \cref{modulesaresmall}, we find that
\[ \hom_R(\mathfrak{a}, \mathbf{M}_{\Omega}(N)) = \varinjlim_{\omega < \Omega}
\hom_R(\mathfrak{a}, \mathbf{M}_{\omega}(N)). \]
This means in particular that there is some $\omega' < \Omega$ such that $f$
factors through the submodule $\mathbf{M}_{\omega'}(N)$, as
\[ f: \mathfrak{a} \to \mathbf{M}_{\omega'}(N) \to \mathbf{M}_{\Omega}(N). \]
However, by the fundamental property of the functor $\mathbf{M}$, we know that
the map $\mathfrak{a} \to \mathbf{M}_{\omega'}(N)$ can be extended to
\[ R \to \mathbf{M}( \mathbf{M}_{\omega'}(N)) = \mathbf{M}_{\omega' + 1}(N), \]
and the last object imbeds in $\mathbf{M}_{\Omega}(N)$.
In particular, $f$ can be extended to $\mathbf{M}_{\Omega}(N)$.
\end{proof}
\end{proof}
\subsection{Split exact sequences}
\add{additive functors preserve split exact seq}
Suppose that
$\xymatrix@1{0 \ar[r] & L \ar[r]^\psi & M \ar[r]^f & N \ar[r] & 0}$
is a split short exact sequence.
Since $\Hom_R (D, \cdot)$ is a left-exact functor, we see that
$$\xymatrix@1{0 \ar[r]
& \Hom_R(D, L) \ar[r]^{\psi'}
& \Hom_R(D, M) \ar[r]^{f'}
& \Hom_R(D, N)}$$
is exact. In addition,
$\Hom_R (D, L \oplus N) \cong \Hom_R(D, L) \oplus \Hom_R (D, N)$. Therefore, in
the case that we start with a split short exact sequence $M \cong L \oplus N$,
applying $\Hom_R (D, \cdot)$ does yield a split short exact sequence
$$\xymatrix@1{0 \ar[r]
& \Hom_R(D, L) \ar[r]^{\psi'}
& \Hom_R(D, M) \ar[r]^{f'}
& \Hom_R(D, N) \ar[r] & 0}.$$
Now, assume that
$$\xymatrix@1{0 \ar[r]
& \Hom_R(D, L) \ar[r]^{\psi'}
& \Hom_R(D, M) \ar[r]^{f'}
& \Hom_R(D, N) \ar[r] & 0}$$
is a short exact sequence of abelian groups for all $R$-modules $D$.
Setting $D = R$ and using $\Hom_R (R, N) \cong N$ yields that
$\xymatrix@1{0 \ar[r] & L \ar[r]^\psi & M \ar[r]^f & N \ar[r] & 0}$
is a short exact sequence.
Set $D = N$, so we have
$$\xymatrix@1{0 \ar[r]
& \Hom_R(N, L) \ar[r]^{\psi'}
& \Hom_R(N, M) \ar[r]^{f'}
& \Hom_R(N, N) \ar[r] & 0}.$$
Here, $f'$ is surjective, so the identity map of $\Hom_R (N, N)$ lifts to a
map $g \in \Hom_R (N, M)$ such that $f \circ g = f'(g) = \id$.
This means that $g$ is a splitting homomorphism for the sequence
$\xymatrix@1{0 \ar[r] & L \ar[r]^\psi & M \ar[r]^f & N \ar[r] & 0}$,
and therefore the sequence is a split short exact sequence.
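To see why the splitting hypothesis matters in the forward direction, it may help to record a standard non-split example.
\begin{example}
Consider the short exact sequence of abelian groups
\[ 0 \to \mathbb{Z} \stackrel{2}{\to} \mathbb{Z} \to \mathbb{Z}/2\mathbb{Z} \to 0, \]
which does not split. Applying $\Hom_{\mathbb{Z}}(\mathbb{Z}/2\mathbb{Z}, \cdot)$ yields
\[ 0 \to 0 \to 0 \to \mathbb{Z}/2\mathbb{Z}, \]
since $\Hom_{\mathbb{Z}}(\mathbb{Z}/2\mathbb{Z}, \mathbb{Z}) = 0$ while
$\Hom_{\mathbb{Z}}(\mathbb{Z}/2\mathbb{Z}, \mathbb{Z}/2\mathbb{Z}) \simeq \mathbb{Z}/2\mathbb{Z}$;
the identity of $\mathbb{Z}/2\mathbb{Z}$ does not lift to a map into $\mathbb{Z}$, so
exactness fails on the right. Thus $\Hom_R(D, \cdot)$ need not carry arbitrary short
exact sequences to short exact sequences.
\end{example}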
\section{The tensor product}
\label{sec:tensorprod}
We shall now introduce the third functor of this chapter: the tensor product.
The tensor product's key property is that it allows one to ``linearize''
bilinear maps. When taking the tensor product of rings, it provides a
categorical coproduct as well.
\subsection{Bilinear maps and the tensor product}
Let $R$ be a commutative ring, as usual.
We have seen that the $\hom$-sets $\hom_R(M,N)$ of $R$-modules $M,N$ are themselves
$R$-modules.
Consequently, if we have three $R$-modules $M,N,P$, we can think about
module-homomorphisms
\[ M \stackrel{\lambda}{\to}\hom_R(N,P). \]
Suppose $x \in M, y \in N$. Then we can consider
\( \lambda(x) \in \hom_R(N,P) \)
and thus we can consider the element
\( \lambda(x)(y) \in P. \)
We denote this element $\lambda(x)(y)$, which depends on the variables $x \in
M, y \in N$, by $\lambda(x,y)$ for convenience; it
is a function of two variables $M \times N \to P$.
There are
certain properties of $\lambda(\cdot, \cdot)$ that we list below.
Fix $x , x' \in M$; $y, y' \in N; \ a \in R$. Then:
\begin{enumerate}
\item $\lambda(x,y+y') = \lambda(x,y) + \lambda(x, y')$ because $\lambda(x)$
is
additive.
\item $\lambda(x, ay) = a \lambda(x,y)$ because $\lambda(x)$ is an
$R$-module homomorphism.
\item $\lambda(x+x', y) = \lambda(x,y) + \lambda(x', y)$ because
$\lambda$ is additive.
\item $\lambda(ax, y) = a\lambda(x,y)$ because $\lambda$ is an $R$-module
homomorphism.
\end{enumerate}
Conversely, given a function $\lambda: M \times N \to P$ of two variables satisfying the above properties,
it is easy to see that we can get a morphism of $R$-modules $M \to
\hom_R(N,P)$.
\begin{definition}
An \textbf{$R$-bilinear map $\lambda: M \times N \to P$} is a map satisfying
the above listed conditions. In other words, it is required to be $R$-linear
in each variable separately.
\end{definition}
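Before continuing, it may help to note a few familiar maps satisfying this definition.
\begin{example}
The multiplication map $R \times R \to R$, $(a,b) \mapsto ab$, is $R$-bilinear, as is the
scalar multiplication $R \times M \to M$, $(a,x) \mapsto ax$, for any $R$-module $M$.
Likewise, the evaluation map
\[ \hom_R(M,N) \times M \to N, \quad (f,x) \mapsto f(x) \]
is $R$-bilinear.
\end{example}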
The previous discussion shows that there is a \emph{bijection} between $R$-bilinear
maps $M \times N \to P$ and $R$-module maps $M \to \hom_R(N,P)$.
Note that the first interpretation is symmetric in $M,N$; the second, by
contrast, can be interpreted in terms of the old concepts of an $R$-module map.
So both are useful.
\begin{exercise}
Prove that a $\mathbb{Z}$-bilinear map out of $\mathbb{Z}/2 \times
\mathbb{Z}/3$ is identically zero, whatever the target module.
\end{exercise}
Let us keep the notation of the previous discussion: in particular, $M,N, P$ will
be modules over a commutative ring $R$.
Given a bilinear map $M \times N \to P$ and a homomorphism $P \to P'$, we can
clearly get a bilinear map $M \times N \to P'$ by composition.
In particular, given $M,N$, there is a \emph{covariant functor} from
$R$-modules to
$\mathbf{Sets}$ sending any $R$-module $P$ to the collection of $R$-bilinear
maps $M \times N
\to P$. As usual, we are interested in when this functor is
\emph{corepresentable.}
As a result,
we are interested in \emph{universal} bilinear maps out of $M \times N$.
\begin{definition}
An $R$-bilinear map $\lambda: M \times N \to P$ is called \textbf{universal} if
for all $R$-modules $Q$, the composition of $P \to Q$ with $M \times N
\stackrel{\lambda}{\to} P$
gives a \textbf{bijection}
\[ \hom_R(P,Q) \simeq \left\{\mathrm{bilinear \ maps} \ M \times N \to
Q\right\} \]
So, given a bilinear map $M \times N \to Q$, there is a \textit{unique} map $P
\to Q$ making the diagram
\[
\xymatrix{
& P \ar[dd] \\
M \times N \ar[ru]^{\lambda} \ar[rd] & \\
& Q
}
\]
commute. Alternatively, $P$ \emph{corepresents} the functor $Q \to
\left\{\mathrm{bilinear \ maps \ } M \times N \to Q\right\}$.
\end{definition}
General nonsense says that given $M,N$, a universal $R$-bilinear map $M
\times N \to P$ is
\textbf{unique} up to isomorphism (if it exists). This follows from \emph{Yoneda's lemma}.
For convenience, we give a direct proof.
Suppose $M \times N \stackrel{\lambda}{\to} P$ was universal and $M \times N
\stackrel{\lambda'}{\to} P'$ is also
universal. Then by the universal property, there are unique maps $P \to P'$
and $P' \to P$ making the
following diagram commutative:
\[
\xymatrix{
& P \ar[dd] \\
M \times N \ar[ru]^{\lambda} \ar[rd]^{\lambda'} & \\
& P' \ar[uu]
}
\]
These compositions $P \to P' \to P, P' \to P \to P'$ have to be the identity
because of the uniqueness part of the universal property.
As a result, $P \to P'$ is an isomorphism.
We shall now show that this universal object does indeed exist.
\begin{proposition} \label{tensorexists}
Given $M,N$, a universal bilinear map out of $M \times N$ exists.
\end{proposition}
Before proving it we make:
\begin{definition}
We denote the codomain of the universal map out of $M \times N $ by $M
\otimes_R N$. This is called the \textbf{tensor product} of $M,N$, so there
is a universal bilinear map out of $M \times N$ into $M \otimes_R N$.
\end{definition}
\begin{proof}[Proof of \rref{tensorexists}] We will simply give
a presentation of the tensor product by
``generators and relations.''
Take the free $R$-module generated by the symbols $\left\{x \otimes
y\right\}_{x \in M, y \in N}$, and define $M \otimes_R N$ to be its quotient by the relations forced upon us
by the definition of a bilinear map (for $x, x' \in M, \ y, y' \in N, \ a
\in R$)
\begin{enumerate}
\item $(x+x') \otimes y = x \otimes y + x' \otimes y$.
\item $(ax) \otimes y = a(x \otimes y) = x \otimes (ay)$.
\item $x \otimes (y+y') = x \otimes y + x \otimes y'$.
\end{enumerate}
We will abuse notation and write $x \otimes y$ for its image in $M \otimes_R
N$ (as opposed to the symbol generating the free module).
There is a map $M \times N \to M \otimes_R N$ sending $(x,y) \to x
\otimes y$; the relations imposed imply that this map is bilinear. We
have to check
that it is universal, but this is actually quite direct.
Suppose we had a bilinear map $\lambda: M \times N \to P$. We must construct
a linear map $M
\otimes_R N \to P$.
To do this, we can just give a map on generators, and show that it is zero on
each of the relations.
It is easy to see that to make the appropriate diagrams commute, the linear
map $M \otimes N \to P$ has to send $x \otimes y \to \lambda(x,y)$.
This factors
through the relations on $x \otimes y$ by bilinearity and leads to an
$R$-linear map $M \otimes_{R} N \to P$ such that the following diagram
commutes:
\[
\xymatrix{
M \times N \ar[r] \ar[rd]^{\lambda} & M \otimes_R N \ar[d] \\
& P
}.\]
It is easy to see that $M \otimes_R N \to P$ is unique because the $x \otimes
y$ generate it.
\end{proof}
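One small caution about the construction just given may be worth recording.
\begin{example}
Every element of $M \otimes_R N$ is a \emph{finite sum} $\sum_i x_i \otimes y_i$ of
``simple'' tensors $x \otimes y$, but need not itself be a simple tensor. For instance,
suppose $R \neq 0$ and let $e_1, e_2$ be the standard basis of $R^2$. Identifying
$R^2 \otimes_R R^2$ with the module of $2 \times 2$ matrices over $R$ via
$x \otimes y \mapsto x y^T$, a simple tensor corresponds to a matrix of determinant
zero, whereas
\[ e_1 \otimes e_1 + e_2 \otimes e_2 \]
corresponds to the identity matrix, whose determinant is $1 \neq 0$.
\end{example}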
The theory of the tensor product allows one to do away with bilinear maps and
just think of linear maps.
Given $M, N$, we have constructed an object $M \otimes_R N$. We now wish to see
the functoriality of the tensor product. In fact, $(M,N) \to M \otimes_R N$ is a \emph{covariant
functor} in two variables from $R$-modules to $R$-modules.
In particular, if $M \to M', N \to N'$ are morphisms, there is a canonical map
\begin{equation} \label{tensorisfunctor} M \otimes_R N \to M' \otimes_R N'.
\end{equation}
To obtain \cref{tensorisfunctor}, we take the natural bilinear map $M \times N \to M' \times N'
\to M' \otimes_R N'$ and use the universal property of $M \otimes_R N$ to get
a map out of it.
\subsection{Basic properties of the tensor product}
We make some observations and prove a few basic properties. As the proofs will
show, one powerful way to prove things about an object is to reason about its
universal property. If two objects have the same universal property, they are
isomorphic.
\begin{proposition}
The tensor product is symmetric: for $R$-modules $M,N$, we have $M \otimes_R
N \simeq N \otimes_R M$
canonically.
\end{proposition}
\begin{proof}
This is clear from the universal properties: giving a bilinear map
out of $M \times N$ is the same as giving a bilinear map out of $N \times M$.
Thus $M \otimes_R N$ and $N \otimes_R M$ have the same universal property.
It is also
clear from the explicit construction.
\end{proof}
\begin{proposition}
For an $R$-module $M$, there is a canonical isomorphism $M \to M \otimes_R R$.
\end{proposition}
\begin{proof}
If we think in terms of
bilinear maps, this statement is equivalent to the statement that a bilinear
map $\lambda: M \times R \to P$ is the same as a linear map $M \to P$. Indeed,
given $\lambda$, we restrict it to $\lambda(\cdot, 1)$ to obtain a linear map
$M \to P$. Conversely, given $f: M \to P$, we take $\lambda(x,a) = af(x)$.
This gives a bijection as claimed.
\end{proof}
\begin{proposition}
The tensor product is associative. There are canonical isomorphisms $M
\otimes_R (N \otimes_R P) \simeq (M
\otimes_R N) \otimes_R P$.
\end{proposition}
\begin{proof}
There are a few ways to see this: one is to build
it explicitly from the construction given, sending $x \otimes (y \otimes z) \to
(x \otimes y) \otimes z$.
More conceptually, both have the same universal
property: by general categorical nonsense (Yoneda's lemma), we need to show
that for all $Q$, there is a canonical bijection
\[ \hom_R(M \otimes (N \otimes P), Q) \simeq \hom_R( (M \otimes N)
\otimes P, Q) \]
where the $R$'s are dropped for simplicity. But both of these sets can be
identified with the set of trilinear maps\footnote{Easy to define.} $M \times N
\times P \to Q$. Indeed
\begin{align*}
\hom_R(M \otimes (N \otimes P), Q) & \simeq \mathrm{bilinear} \ M \times (N
\otimes P) \to Q \\
& \simeq \hom(N \otimes P, \hom(M,Q)) \\
& \simeq \mathrm{bilinear} \ N \times P \to \hom(M,Q) \\
& \simeq \hom(N, \hom(P, \hom(M,Q))) \\
& \simeq \mathrm{trilinear\ maps}.
\end{align*}
\end{proof}
\subsection{The adjoint property}
Finally, while we defined the tensor product in terms of a ``universal
bilinear map,'' we saw earlier that bilinear maps could be interpreted as maps
into a suitable $\hom$-set.
In particular, fix $R$-modules $M,N,P$. We know that the set of bilinear maps
$M \times N \to P$ is naturally in bijection with
\[ \hom_R(M, \hom_R(N,P)) \]
as well as with
\[ \hom_R(M \otimes_R N, P). \]
As a result, we find:
\begin{proposition} For $R$-modules $M,N,P$, there is a natural bijection
\[ \hom_R(M,\hom_R(N,P)) \simeq \hom_R(M \otimes_R N, P). \]
\end{proposition}
There is a more evocative way of phrasing the above natural bijection. Given
$N$, let us define the functors $F_N, G_N$ via
\[ F_N(M) = M \otimes_R N, \quad G_N(P) = \hom_R(N,P). \]
Then the above proposition states that there is a natural isomorphism
\[ \hom_R( F_N(M), P) \simeq \hom_R( M, G_N(P)). \]
In particular, $F_N$ and $G_N$ are \emph{adjoint functors}. So, in a sense,
the operations of $\hom$ and $\otimes$ are dual to each other.
\begin{proposition} \label{tensorcolimit}
Tensoring commutes with colimits.
\end{proposition}
In particular, it follows that if $\left\{N_\alpha\right\}$ is a family of
modules, and $M$ is a module, then
\[ M \otimes_R \bigoplus N_\alpha = \bigoplus M \otimes_R N_\alpha. \]
\begin{exercise}
Give an explicit proof of the above relation.
\end{exercise}
\begin{proof}
This is a formal consequence of the fact that the tensor product is a left
adjoint and consequently commutes with all colimits.
\add{proof}
\end{proof}
In particular, by \cref{tensorcolimit}, the tensor product commutes with \emph{cokernels.}
That is, if $A \to B \to C \to 0$ is an exact sequence of $R$-modules and $M$
is an $R$-module, $A \otimes_R M \to B \otimes_R M \to C \otimes_R M \to 0$ is
also exact, because exactness of such a sequence is precisely a condition on
the cokernel.
That is, the tensor product is \emph{right exact.}
We can thus prove a simple result on finite generation:
\begin{proposition} \label{fingentensor}
If $M, N$ are finitely generated, then $M \otimes_R N$ is finitely generated.
\end{proposition}
\begin{proof}
Indeed, if we have surjections $R^m \to M, R^n \to N$, we can tensor them; we
get a surjection since the tensor product is right-exact.
So we have a surjection
$R^{m n} = R^m \otimes_R R^n \to M \otimes_R N$.
\end{proof}
\subsection{The tensor product as base-change}
Before this, we have considered the tensor product as a functor within a
fixed category. Now, we shall see that when one takes the tensor product with a
\emph{ring}, one gets additional structure. As a result, we will be able to
get natural functors between \emph{different} module categories.
Suppose we have a
ring-homomorphism $\phi:R \to R'$. In this case, any $R'$-module can be
regarded as
an $R$-module.
In particular, there is a canonical functor of \emph{restriction}
\[ R'\mbox{-}\mathrm{modules} \to R\mbox{-}\mathrm{modules}. \]
We shall see that the tensor product provides an \emph{adjoint} to this
functor.
Namely, if $M$ has an $R$-module
structure, then $M \otimes_R R'$ has an $R'$-module structure, where $R'$ acts
on the right. Since the tensor product is functorial, this gives a functor
in the opposite direction:
\[ R\mbox{-}\mathrm{modules} \to R'\mbox{-}\mathrm{modules}. \]
Let $M'$ be an $R'$-module and $M$ an $R$-module. In view of the above,
we can talk about
\[ \hom_R(M, M') \]
by thinking of $M'$ as an $R$-module.
\begin{proposition}
There is a canonical isomorphism
\[ \hom_R(M, M') \simeq \hom_{R'}(M \otimes_R R', M'). \]
In particular, the functor $M \to M \otimes_R R'$ is left adjoint to the
restriction functor.
\end{proposition}
\begin{proof}
We can describe the bijection explicitly. Given an $R'$-homomorphism $f:M
\otimes_R R' \to M'$, we get a map
\[ f_0:M \to M' \]
sending
\[ m \to m \otimes 1 \to f(m \otimes 1). \]
This is easily seen to be an $R$-module-homomorphism. Indeed,
\[ f_0(ax) = f(ax \otimes 1) = f(\phi(a)(x \otimes 1)) = a f(x \otimes 1) =
a f_0(x) \]
since $f$ is an $R'$-module homomorphism.
Conversely, if we are given a homomorphism of $R$-modules
\[ f_0: M \to M' \]
then we can define
\[ f: M \otimes_R R' \to M' \]
by sending $m \otimes r' \to r' f_0(m)$, which is a homomorphism of $R'$
modules.
This is well-defined because $f_0$ is a homomorphism of $R$-modules. We leave
some details to the reader.
\end{proof}
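Base change along a ring map is already familiar in concrete situations; here is one standard illustration.
\begin{example}
Take $R = \mathbb{R}$ and $R' = \mathbb{C}$. For a real vector space $V$, the base
change $V \otimes_{\mathbb{R}} \mathbb{C}$ is the complexification of $V$, and the
proposition says that $\mathbb{R}$-linear maps $V \to W$ into a complex vector space
$W$ correspond bijectively to $\mathbb{C}$-linear maps $V \otimes_{\mathbb{R}} \mathbb{C} \to W$.
Similarly, base change along $\mathbb{Z} \to \mathbb{Q}$ sends an abelian group $A$ to
the $\mathbb{Q}$-vector space $A \otimes_{\mathbb{Z}} \mathbb{Q}$.
\end{example}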
\begin{example}
In the representation theory of finite groups, the operation of tensor product
corresponds to the procedure of \emph{inducing} a representation. Namely, if
$H \subset G$ is a subgroup of a group $G$, then there is an obvious
restriction functor from $G$-representations to $H$-representations.
The adjoint to this is the induction operator. Since an $H$-representation
(resp. a $G$-representation) is just a module over the group ring, the
operation of induction is really a special case of the tensor product. Note
that the group rings are generally not commutative, so this should be
interpreted with some care.
\end{example}
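Concretely, if $k$ is a field, $H \subset G$ are finite groups, and $V$ is a
$k[H]$-module, the induced representation (which we denote $\mathrm{Ind}_H^G V$ here)
may be described as
\[ \mathrm{Ind}_H^G V = k[G] \otimes_{k[H]} V, \]
with $k[G]$ regarded as a right $k[H]$-module via the inclusion $k[H] \subset k[G]$;
this is a tensor product over a generally noncommutative ring, as cautioned above.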
\subsection{Some concrete examples}
We now present several concrete computations of tensor products in explicit
cases to illuminate what is happening.
\begin{example} Let us compute $\mathbb{Z}/10 \otimes_{\mathbb{Z}}
\mathbb{Z}/12$.
Since $1$ spans $\mathbb{Z} / (10)$ and $1$ spans $\mathbb{Z} / (12)$,
we see that $1 \otimes 1$ spans $\mathbb{Z} / (10) \otimes \mathbb{Z} /
(12)$ and this tensor
product is a cyclic group.
Note that
$1 \otimes 0 = 1 \otimes (10 \cdot 0) = 10 \otimes 0 = 0 \otimes 0 = 0$
and
$0 \otimes 1 = (12 \cdot 0) \otimes 1 = 0 \otimes 12 = 0 \otimes 0 = 0$.
Now,
$10 (1 \otimes 1) = 10 \otimes 1 = 0 \otimes 1 = 0$
and
$12 (1 \otimes 1) = 1 \otimes 12 = 1 \otimes 0 = 0$,
so the cyclic group $\mathbb{Z} / (10) \otimes \mathbb{Z} / (12)$ has order
dividing both
$10$ and $12$. This means that the cyclic group has order dividing
$\gcd(10, 12) = 2$.
To show that the order of $\mathbb{Z} / (10) \otimes \mathbb{Z} / (12)$ is exactly two,
define a bilinear map
$g: \mathbb{Z} / (10) \times \mathbb{Z} / (12) \to \mathbb{Z} / (2)$ via
$g : (x, y) \mapsto xy$; this is well-defined since $2$ divides both $10$ and
$12$. The universal property of tensor products then
says that there is a unique linear map
$f: \mathbb{Z} / (10) \otimes \mathbb{Z} / (12) \to \mathbb{Z} / (2)$ making
the diagram
\[
\xymatrix{
\mathbb{Z} / (10) \times \mathbb{Z} / (12) \ar[r]^\otimes \ar[rd]_g
& \mathbb{Z} / (10) \otimes \mathbb{Z} / (12) \ar[d]^f \\
& \mathbb{Z} / (2)
}
\]
commute. In particular, this means that $f (x \otimes y) = g(x, y) = xy$.
Hence, $f(1 \otimes 1) = 1$, so $f$ is surjective, and therefore,
$\mathbb{Z} / (10) \otimes \mathbb{Z} / (12)$ has size at least two. This
allows us to
conclude that $\mathbb{Z} / (10) \otimes \mathbb{Z} / (12) = \mathbb{Z} / (2)$.
\end{example}
We now generalize the above example to tensor products of cyclic groups.
\begin{example}
Let $d=\gcd(m,n)$. We will show that
$(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})\simeq(\mathbb{Z}/d\mathbb{Z})$,
and thus in particular if $m$ and $n$
are relatively prime, then
$(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})\simeq(0)$. First,
note that
any $a\otimes b\in(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})$
can be written as $ab(1\otimes 1)$,
so that $(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})$ is generated
by $1\otimes 1$ and hence
is a cyclic group. We know from elementary number theory that $d=xm+yn$
for some $x,y\in\mathbb{Z}$. We have $m(1\otimes 1)=m\otimes 1=0\otimes
1=0$ and
$n(1\otimes 1)=1\otimes n=1\otimes0=0$. Thus $d(1\otimes 1)=(xm+yn)(1\otimes
1)=0$, so that $1\otimes1$ has order dividing $d$.
Conversely, consider the map
$f:(\mathbb{Z}/m\mathbb{Z})\times(\mathbb{Z}/n\mathbb{Z})\rightarrow(\mathbb{Z}/d\mathbb{Z})$
defined by
$f(a+m\mathbb{Z},b+n\mathbb{Z})=ab+d\mathbb{Z}$. This is well-defined,
since if $a'+m\mathbb{Z}=a+m\mathbb{Z}$
and $b'+n\mathbb{Z}=b+n\mathbb{Z}$ then $a'=a+mr$ and $b'=b+ns$ for some
$r,s$ and
thus $a'b'+d\mathbb{Z}=ab+(mrb+nsa+mnrs)+d\mathbb{Z}=ab+d\mathbb{Z}$
(since $d=\gcd(m,n)$
divides $m$ and $n$). This is obviously bilinear, and hence induces a map
$\tilde{f}:(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})\rightarrow(\mathbb{Z}/d\mathbb{Z})$,
which
has $\tilde{f}(1\otimes1)=1+d\mathbb{Z}$. But the order of $1+d\mathbb{Z}$
in $\mathbb{Z}/d\mathbb{Z}$ is $d$, so that the order of $1\otimes1$ in
$(\mathbb{Z}/m\mathbb{Z})\otimes(\mathbb{Z}/n\mathbb{Z})$ must be at least
$d$. Thus $1\otimes1$ is in fact
of order $d$, and the map $\tilde{f}$ is an isomorphism between cyclic groups
of order $d$.
\end{example}
Finally, we present an example involving the interaction of $\hom$ and the
tensor product.
\begin{example}
Given an $R$-module $M$, let us use the notation $M^* = \hom_R(M,R)$.
We shall define a functorial map
\[ M^* \otimes_R N \to \hom_R(M,N), \]
and show that it is an isomorphism when $M$ is finitely generated and free.
Define $\rho':M^*\times N\rightarrow\hom_R(M,N)$ by
$\rho'(f,n)(m)=f(m)n$ (note that $f(m)\in R$, and the multiplication $f(m)n$
is that between an element of $R$ and an element of $N$). This is bilinear,
\[\rho'(af+bg,n)(m)=(af+bg)(m)n=(af(m)+bg(m))n=af(m)n+bg(m)n=a\rho'(f,n)(m)+b\rho'(g,n)(m)\]
\[\rho'(f,an_1+bn_2)(m)=f(m)(an_1+bn_2)=af(m)n_1+bf(m)n_2=a\rho'(f,n_1)(m)+b\rho'(f,n_2)(m)\]
so it induces a map $\rho:M^*\otimes N \rightarrow \hom(M,N)$ with
$\rho(f\otimes n)(m)=f(m)n$. This homomorphism is unique since the $f\otimes
n$ generate $M^*\otimes N$. \\
\noindent Suppose $M$ is free on the set $\{a_1,\ldots,a_k\}$. Then
$M^*=\hom(M,R)$ is free on the set $\{f_i:M\rightarrow R,$ $
f_i(r_1a_1+\cdots+r_ka_k)=r_i\}$, because there are clearly no
relations among the $f_i$ and because any $f:M\rightarrow R$ has
$f=f(a_1)f_1+\cdots+f(a_k)f_k$. Also note that any element $\sum h_j\otimes
p_j \in M^*\otimes N$ can be written in the form $\sum_{i=1}^k f_i\otimes
n_i$, by setting $n_i=\sum h_j(a_i)p_j$, and \textit{that this is unique}
because the $f_i$ are a basis for $M^*$.\\
\noindent We claim that the map $\psi:\hom_R(M,N)\rightarrow M^*\otimes N$
defined by $\psi(g)=\sum_{i=1}^k f_i\otimes g(a_i)$ is inverse to $\rho$. Given
any $\sum_{i=1}^k f_i\otimes n_i\in M^*\otimes N$, we have
\[\rho(\sum_{i=1}^k f_i\otimes n_i)(a_j)=\sum_{i=1}^k\rho(f_i\otimes
n_i)(a_j)=\sum_{i=1}^kf_i(a_j)n_i=n_j\]
Thus, $\rho(\sum_{i=1}^k f_i\otimes n_i)(a_i)=n_i$, and thus
$\psi(\rho(\sum_{i=1}^k f_i\otimes n_i))=\sum_{i=1}^k f_i\otimes n_i$. Thus,
$\psi\circ\rho=\id_{M^*\otimes N}$.\\
\noindent Conversely, recall that for $g:M\rightarrow N\in\hom_R(M,N)$,
we defined $\psi(g)=\sum_{i=1}^k f_i\otimes g(a_i)$. Thus,
\[\rho(\psi(g))(a_j)=\rho(\sum_{i=1}^k f_i\otimes
g(a_i))(a_j)=\sum_{i=1}^k\rho(f_i\otimes g(a_i))(a_j)=\sum_{i=1}^k
f_i(a_j)g(a_i)=g(a_j)\]
and because $\rho(\psi(g))$ agrees with $g$ on the $a_i$, it is the
same element of $\hom_R(M,N)$ because the $a_i$ generate $M$. Thus,
$\rho\circ\psi=\id_{\hom_R(M,N)}$.\\
\noindent Thus, $\rho$ is an isomorphism.
\end{example}
We now interpret localization as a tensor product.
\begin{proposition} \label{locisbasechange}
Let $R$ be a commutative ring, $S \subset R$ a multiplicative subset. Then
there
exists a canonical isomorphism of functors:
\[ \phi: S^{-1}M \simeq S^{-1 }R \otimes_R M . \]
\end{proposition}
\begin{proof}
Here is a construction of $\phi$. If $x/s \in S^{-1}M$ where $x \in M, s \in
S$, we define
\[ \phi(x/s) = (1/s) \otimes x. \]
Let us check that this is well-defined. Suppose $x/s = x'/s'$; then this means
there is $t \in S$ with
\[ xs't = x'st . \]
From this we need to check that $\phi(x/s) = \phi(x'/s')$, i.e. that $1/s
\otimes x$ and $1/s' \otimes x'$ represent the same element in the tensor
product. But we know from the last statement that
\[ \frac{1}{ss't} \otimes xs't = \frac{1}{ss't} \otimes x'st \in S^{-1}R \otimes M \]
and the first is just
\[ s't( \frac{1}{ss't} \otimes x) = \frac{1}{s} \otimes x \]
by linearity, while the second is just
\[ \frac{1}{s'} \otimes x' \]
similarly. One next checks that $\phi$ is an $R$-module homomorphism, which we
leave to the reader.
Finally, we need to describe the inverse. The inverse $\psi: S^{-1}R \otimes M
\to S^{-1}M$ is easy to construct because it's a map out of the tensor product,
and we just need to give a bilinear map
\[ S^{-1} R \times M \to S^{-1}M , \]
and this sends $(r/s, m)$ to $mr/s$.
It is easy to see that $\phi, \psi$ are inverses to each other by the
definitions.
\end{proof}
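As a quick illustration of \cref{locisbasechange}:
\begin{example}
Take $R = \mathbb{Z}$ and $S = \mathbb{Z} \setminus \left\{0\right\}$, so that
$S^{-1}\mathbb{Z} = \mathbb{Q}$. The proposition identifies $\mathbb{Q} \otimes_{\mathbb{Z}} M$
with the localization $S^{-1}M$ for any abelian group $M$. For instance,
$\mathbb{Q} \otimes_{\mathbb{Z}} \mathbb{Q} \simeq S^{-1}\mathbb{Q} = \mathbb{Q}$, while
$\mathbb{Q} \otimes_{\mathbb{Z}} M = 0$ whenever $M$ is a torsion group, since each
fraction $m/s$ is then annihilated by some nonzero integer and hence is already zero
in $S^{-1}M$.
\end{example}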
It is, perhaps, worth making a small categorical comment, and offering an
alternative argument.
We are given two functors $F,G$ from $R$-modules to $S^{-1}R$-modules, where
$F(M) = S^{-1}R \otimes_R M$ and $G(M) = S^{-1}M$.
By the adjunction between base change and restriction, the natural map of
$R$-modules $M \to S^{-1}M$ (whose target is an $S^{-1}R$-module) gives a natural map
\[ S^{-1}R \otimes_R M \to S^{-1}M, \]
that is, a natural transformation $F \to G$.
Since it is an isomorphism for free modules, it is an isomorphism for all
modules by a standard argument.
\subsection{Tensor products of algebras}
\label{tensprodalg}
There is one other basic property of tensor products to discuss before moving
on: namely, what happens when one tensors a ring with another ring. We shall
see that this gives rise to \emph{push-outs} in the category of rings, or
alternatively, coproducts in the category of $R$-algebras.
Let $R$ be a commutative ring and suppose $R_0, R_1$ are $R$-algebras. That is, we have ring homomorphisms
\( \phi_0: R \to R_0, \quad \phi_1: R \to R_1. \)
\begin{proposition}
$R_0 \otimes_R R_1$ has the structure of a commutative ring in a natural way.
\end{proposition}
Indeed, this
multiplication multiplies two typical elements $x \otimes y, x' \otimes y'$ of
the tensor product by
sending them to
$xx' \otimes yy'$.
The ring structure is determined by this formula. One ought to check that this
approach respects the relations of the tensor product. We will do so in an
indirect way.
\begin{proof}
Notice that giving a multiplication law on $R_0 \otimes_R R_1$ is equivalent to giving an $R$-bilinear map
\[ (R_0 \otimes_R R_1) \times (R_0 \otimes_R R_1) \to R_0 \otimes_R R_1,\]
i.e. an $R$-linear map
\[ (R_0 \otimes_R R_1) \otimes_R (R_0 \otimes_R R_1) \to R_0 \otimes_R R_1\]
which satisfies certain constraints (associativity, commutativity, etc.).
But the left side is isomorphic to $(R_0 \otimes_R R_0) \otimes_R (R_1
\otimes_R R_1)$. Since we have bilinear maps $R_0 \times R_0 \to R_0$ and $R_1
\times R_1 \to R_1$, we get linear maps
$R_0 \otimes_R R_0 \to R_0$ and $R_1 \otimes_R R_1 \to R_1$.
Tensoring these maps gives the multiplication as a bilinear map. It is easy to
see that these two approaches are the same.
We now need to check that this operation is commutative and associative, with
$1 \otimes 1$ as a unit; moreover, it distributes over addition. Distributivity
over addition is built into the construction (i.e. in view of bilinearity). The
rest (commutativity, associativity, units) can be checked directly on the
generators, since we have distributivity.
We shall leave the details to the reader.
\end{proof}
We can in fact describe the tensor product of $R$-algebras by a universal
property. We will
describe a commutative diagram:
\[
\xymatrix{
& R \ar[rd] \ar[ld] & \\
R_0 \ar[rd] & & R_1 \ar[ld] \\
& R_0 \otimes_R R_1
}
\]
Here $R_0 \to R_0 \otimes_R R_1$ sends $x \mapsto x \otimes 1$; similarly,
$R_1 \to R_0 \otimes_R R_1$ sends $y \mapsto 1 \otimes y$. These are ring-homomorphisms, and it is easy to
see that
the above
diagram commutes, since $r \otimes 1 = 1 \otimes r = r(1 \otimes 1)$ for $r \in
R$.
In fact,
\begin{proposition}
$R_0 \otimes_R R_1$ is universal with respect to this property: in the language
of category theory, the above diagram is a pushout square.
\end{proposition}
This means that for any commutative ring $B$ and every pair of maps $u_0: R_0 \to
B$ and $u_1: R_1 \to B$ such that the pull-backs $R \to R_0 \to B$ and $R \to
R_1 \to B$ are the same, there is a unique map of rings
\[ R_0 \otimes_R R_1 \to B \]
which restricts on $R_0, R_1$ to the morphisms $u_0, u_1$ that we started with.
\begin{proof} If $B$ is a ring as in the previous paragraph, we make $B$ into an $R$-module by the map $R \to R_0 \to B$ (or
$R \to R_1 \to B$; it is the same by assumption).
We define the map $R_0 \otimes_R R_1 \to B$ by sending
\[ x \otimes y \to u_0(x) u_1(y). \]
It is easy to check that $(x,y) \to u_0(x)u_1(y)$ is $R$-bilinear (because of
the condition that the two pull-backs of $u_0, u_1$ to $R$ are the same), and
that it gives a homomorphism of rings $R_0 \otimes_R R_1 \to B$ which
restricts to $u_0, u_1$ on $R_0,
R_1$. One can check, for instance, that this is a homomorphism of rings by
looking at the generators.
It is also clear that $R_0 \otimes_R R_1 \to B$ is unique, because we know
that the
map on elements of the form $x \otimes 1$ and $1 \otimes y$ is determined by
$u_0, u_1$; these generate $R_0 \otimes_R R_1$, though.
\end{proof}
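A standard example of this universal property is provided by polynomial rings.
\begin{example}
For any commutative ring $R$, there is a natural isomorphism of $R$-algebras
\[ R[x] \otimes_R R[y] \simeq R[x,y]. \]
One way to see this is via the universal property just established: for a commutative
$R$-algebra $B$, a map of $R$-algebras out of either side is determined by an arbitrary
pair of elements of $B$ (the images of $x$ and $y$), so both sides corepresent the same
functor. Alternatively, the assignment $x^i \otimes y^j \mapsto x^i y^j$ carries a
basis to a basis.
\end{example}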
In fact, we now claim that the category of rings has \emph{all} coproducts. We
see that the coproduct of any two rings exists (as the tensor product over
$\mathbb{Z}$). It turns out that arbitrary coproducts exist. More generally,
if $\left\{R_\alpha\right\}$ is a family of $R$-algebras, then one can define
an object
\[ \bigotimes_\alpha R_\alpha, \]
which is a coproduct of the $R_\alpha$ in the category of $R$-algebras. To do
this, we simply take the generators as before, as formal objects
\[ \bigotimes r_\alpha, \quad r_\alpha \in R_\alpha, \]
except that all but finitely many of the $r_\alpha$ are required to be the
identity. One quotients by the usual relations.
Alternatively, one may use the fact that filtered colimits exist, and
construct the infinite coproduct as a colimit of finite coproducts (which are
just ordinary tensor products).
\section{Exactness properties of the tensor product}
In general, the tensor product is not exact; it is only exact on the right,
but it can fail to preserve injections. Yet in some important cases it
\emph{is}
exact. We study that in the present section.
\subsection{Right-exactness of the tensor product}
We will start by discussing the extent to which the tensor product preserves
exactness in general.
First, let's recall what is going on. If $M,N$ are $R$-modules over the
commutative ring $R$, we have defined another $R$-module $\hom_R(M,N)$
of morphisms
$M \to N$. This is left-exact as a functor of $N$. In other words, if we fix
$M$ and let $N$ vary, then the construction of homming out of $M$ preserves
kernels.
In the language of category theory, this construction $N \to \hom_R(M,N)$ has
an adjoint. The other construction we discussed above was this adjoint,
and it is the tensor
product. Namely, given $M,N$ we defined a \textbf{tensor product} $M \otimes_R
N$ such that giving a map $M \otimes_R N \to P$ into some $R$-module $P$
is the same as giving a
bilinear map $\lambda: M \times N \to P$, which in turn is the same as giving
an $R$-linear map
\[ M \to \hom_R(N, P). \]
So we have a functorial isomorphism
\[ \hom_R(M \otimes_R N, P) \simeq \hom_R(M, \hom_R(N,P)). \]
Alternatively, tensoring is the left-adjoint to the
hom functor. By abstract nonsense, it follows that since $\hom(M, \cdot)$
preserves cokernels, the left-adjoint preserves cokernels and is right-exact.
We shall see this directly.
\begin{proposition}
The functor $N \to M \otimes_R N$ is right-exact, i.e. preserves cokernels.
\end{proposition}
In fact, the tensor product is symmetric, so it's right exact in either
variable.
\begin{proof}
We have to show that if $N' \to N \to N'' \to 0$ is exact, then so is
\[ M \otimes_R N' \to M \otimes_R N \to M \otimes_R N'' \to 0. \]
There are a lot of different ways to think about this. For instance, we can
look at the direct construction. The tensor product is a certain quotient of a
free module.
$M \otimes_R N''$ is the quotient of the free module generated by $m \otimes
n''$, for $m \in M, n'' \in N''$, modulo the usual relations. The map $M \otimes N \to
M \otimes N''$ sends $m \otimes n \to m \otimes n''$ if $n'' $ is the image of
$n$ in $N''$. Since each $n''$ can be lifted to some $n$, it is obvious that
the map $M \otimes_R N \to M \otimes_R N''$ is surjective.
Now we know that $M \otimes_R N''$ is a quotient of $M \otimes_R N$. But which
relations do you have to impose on $M \otimes_R N$ to get $M \otimes_R
N''$? In fact, each relation in $M \otimes_R N''$
can be lifted to a relation in $M \otimes_R N$, but with some redundancy. So
the only thing to quotient
out by is the statement that $x \otimes y, x \otimes y'$ have the same image in
$M \otimes N''$. In particular, we have to quotient out by
\[ x \otimes y - x\otimes y' \ , y - y' \in N' \]
so that if we kill off $x \otimes n'$ for $n' \in N' \subset N$, then we get $M
\otimes N''$. This is a direct proof.
One can also give a conceptual proof. We would like to know that $M \otimes N''$
is the cokernel of $M \otimes N' \to M \otimes N$. In other words, we'd like
to know that if we map $M \otimes_R N$ into some $P$ and the pull-back to $M
\otimes_R N'$ is zero, then the map factors uniquely through $M \otimes_R N''$.
Namely, we need to show that
\[ \hom_R(M \otimes N'', P) = \ker(\hom_R(M \otimes N, P) \to \hom_R(M
\otimes N', P)). \]
But the first is just $\hom_R(N'', \hom_R(M,P))$ by the adjointness property.
Similarly, the second is just
\[ \ker( \hom_R(N, \hom(M,P)) \to \hom_R(N', \hom_R(M,P))), \]
and this kernel is just $\hom_R(N'', \hom_R(M,P))$, by the very statement
that $N'' = \mathrm{coker}(N ' \to N)$.
To give a map $N'' $ into some module (e.g. $\hom_R(M,P)$) is the same thing as
giving a map out of $N$ which kills $N'$.
So we get the functorial isomorphism.
\end{proof}
\begin{remark}
Formation of tensor products is, in general, \textbf{not} exact.
\end{remark}
\begin{example} \label{tensorbad}
Let $R = \mathbb{Z}$. Let $M = \mathbb{Z}/2\mathbb{Z}$. Consider the exact
sequence
\[ 0 \to \mathbb{Z} \to \mathbb{Q} \to \mathbb{Q}/\mathbb{Z} \to 0 \]
which we can tensor with $M$, yielding
\[ 0 \to \mathbb{Z}/2\mathbb{Z} \to \mathbb{Q} \otimes
\mathbb{Z}/2\mathbb{Z} \to \mathbb{Q}/\mathbb{Z} \otimes
\mathbb{Z}/2\mathbb{Z} \to 0 \]
I claim that the middle term $\mathbb{Q} \otimes \mathbb{Z}/2\mathbb{Z}$
is zero. This is because by tensoring with
$\mathbb{Z}/2\mathbb{Z}$, we've made multiplication by 2 identically zero. By
tensoring with $\mathbb{Q}$, we've made multiplication by 2 invertible. The
only way to reconcile this is to have the second term zero. In particular, the
sequence becomes
\[ 0 \to \mathbb{Z}/2\mathbb{Z} \to 0 \to 0 \to 0 \]
which is not exact.
\end{example}
\begin{exercise}
Let $R$ be a ring, $I, J \subset R$ ideals. Show that $R/I \otimes_R R/J
\simeq R/(I+J)$.
\end{exercise}
\subsection{A characterization of right-exact functors}
Let us consider additive functors on the category of $R$-modules. So far,
we know a very easy way of getting such functors: given an $R$-module $N$, we
have a functor
\[ T_N: M \to M \otimes_R N. \]
In other words, we have a way of generating a functor on the category of
$R$-modules for each $R$-module. These functors are all right-exact, as we
have seen.
Now we will prove a converse.
\begin{proposition}
Let $F$ be a right-exact functor on the category of $R$-modules that commutes
with direct sums. Then $F$ is isomorphic to some $T_N$.
\end{proposition}
\begin{proof}
The idea is that $N$ will be $F(R)$.
Without the right-exactness hypothesis, we shall construct a natural morphism
\[ F(R) \otimes M \to F(M) \]
as follows. Given $m \in M$, there is a natural map $R \to M$ sending $1 \to
m$. This identifies $M = \hom_R(R, M)$. But functoriality gives a map $F(R)
\times \hom_R(R, M) \to F(M)$, which is clearly $R$-bilinear; the universal
property of the tensor product now produces the desired transformation
$T_{F(R)} \to F$.
It is clear that $T_{F(R)}(M) \to F(M)$ is an isomorphism for $M = R$, and
thus for $M$ free, as both $T_{F(R)}$ and $F$ commute with direct sums. Now
let $M$ be any $R$-module. There is a ``free presentation,'' that is an exact
sequence
\[ R^I \to R^J \to M \to 0 \]
for some sets $I,J$; we get a commutative, exact diagram
\[ \xymatrix{
T_{F(R)}(R^I)\ar[d] \ar[r] & T_{F(R)} (R^J) \ar[d] \ar[r] & T_{F(R)} (M) \ar[d] \ar[r] & 0 \\
F(R^I) \ar[r] & F(R^J) \ar[r] & F( M )\ar[r] & 0
}\]
where the leftmost two vertical arrows are isomorphisms. A diagram chase now
shows that $T_{F(R)}(M) \to F(M)$ is an isomorphism. In particular, $F \simeq
T_{F(R)}$ as functors.
\end{proof}
Without the hypothesis that $F$ commutes with arbitrary direct sums, we could only draw
the same conclusion on the category of \emph{finitely presented} modules; the
same proof as above goes through, though $I$ and $J$ are required to be
finite.\footnote{Recall that an additive functor commutes with finite direct
sums.}
\begin{proposition}
Let $F$ be a right-exact additive functor on the category of finitely presented
$R$-modules. Then $F$ is isomorphic to some $T_N$ on this category.
\end{proposition}
From this we can easily see that localization at a multiplicative subset $S
\subset R$ is given by tensoring with $S^{-1}R$. Indeed, localization is a
right-exact functor on the category of $R$-modules that commutes with direct
sums, so it is given by tensoring with some module $M$; applying the functor
to $R$ shows that $M=S^{-1}R$.
\subsection{Flatness}
In some cases, though, the tensor product is exact.
\begin{definition} \label{flatdefn}
Let $R$ be a commutative ring. An $R$-module $M$ is called \textbf{flat} if the
functor $N \to M \otimes_R N$ is exact. An $R$-algebra is \textbf{flat} if it is flat as an
$R$-module.
\end{definition}
We already know that tensoring with anything is right exact,
so the only thing to be checked for flatness of $M$ is that the operation of tensoring by $M$
preserves injections.
\begin{example}
$\mathbb{Z}/2\mathbb{Z}$ is not flat as a $\mathbb{Z}$-module by
\rref{tensorbad}.
\end{example}
\begin{example} \label{projmoduleisflat}
If $R$ is a ring, then $R$ is flat as an $R$-module, because tensoring by $R$
is the identity functor.
More generally, if $P$ is a projective module (i.e., homming out of $P$
is exact), then $P$ is flat.
\end{example}
\begin{proof}
If $P = \bigoplus_A R$ is free, then tensoring with $P$ corresponds to taking
the direct sum $|A|$ times, i.e.
\[ P \otimes_R M = \bigoplus_A M. \]
This is because the tensor product commutes with (finite or infinite) direct
sums, and tensoring with $R$ is the identity.
The functor $M \to \bigoplus_A M$ is exact, so free
modules are flat.
A projective module, as discussed earlier, is a direct summand of a free
module. So if $P$ is projective, $P \oplus P' \simeq \bigoplus_A R$ for some
$P'$. Then we have that
\[ (P \otimes_R M) \oplus (P' \otimes_R M) \simeq \bigoplus_A M. \]
If we have an injection $M \to M'$, then the direct sum decomposition
yields a diagram of maps
\[ \xymatrix{
P \otimes_R M \ar[d] \ar[r] & \bigoplus_A M \ar[d] \\
P \otimes_R M' \ar[r] & \bigoplus_A M'
}.\]
A diagram-chase now shows that the left vertical map is injective. Namely, the
composition $P \otimes_R M \to \bigoplus_A M \to \bigoplus_A M'$ is injective
and equals the composition $P \otimes_R M \to P \otimes_R M' \to \bigoplus_A M'$,
so the left vertical map has to be injective too.
\end{proof}
\begin{example}
If $S \subset R$ is a multiplicative subset, then $S^{-1}R $ is a flat $R$-module, because localization is an
exact functor.
\end{example}
Let us make a few other comments.
\begin{remark}
Let $\phi: R \to R'$ be a homomorphism of rings. Then, first of all, any
$R'$-module can be regarded as an $R$-module by composition with $\phi$. In
particular, $R'$ is an $R$-module.
If $M$ is an $R$-module, we can define
\[ M \otimes_R R' \]
as an $R$-module. But in fact this tensor product is an $R'$-module; it has
an action of $R'$. If $x \in M$ and $a \in R'$ and $b \in R'$, multiplication
of $(x \otimes a) \in M \otimes_R R'$ by $b \in R'$ sends this, \emph{by
definition}, to
\[ b(x \otimes a) = x \otimes ab. \]
It is easy to check that this defines an action of $R'$ on $M \otimes_R R'$.
(One has to check that this action factors through the appropriate relations,
etc.)
\end{remark}
The following fact shows that the hom-sets behave nicely with respect to flat
base change.
\begin{proposition}
Let $M$ be a finitely presented $R$-module, $N$ an $R$-module. Let $S$ be a
flat $R$-algebra. Then the natural map
\[ \hom_R(M,N) \otimes_R S \to \hom_S( M \otimes_R S, N \otimes_R S) \]
is an isomorphism.
\end{proposition}
\begin{proof}
Indeed, it is clear that there is a natural map
\[ \hom_R(M, N) \to \hom_S(M \otimes_R S, N \otimes_R S) \]
of $R$-modules. The latter is an $S$-module, so the universal property gives
the map $\hom_R(M, N) \otimes_R S \to \hom_S(M \otimes_R S, N \otimes_R S)$ as
claimed.
If $N$ is fixed, then we have two contravariant functors
in $M$,
\[ T_1(M) = \hom_R(M, N) \otimes_R S, \quad T_2(M) = \hom_S(M \otimes_R S, N
\otimes_R S). \]
We also have a natural transformation $T_1(M) \to T_2(M)$.
It is clear that if $M$ is \emph{finitely generated} and \emph{free}, then the
natural transformation is an isomorphism (for example, if $M = R$, then we just
have the map $N \otimes_R S \to N \otimes_R S$).
Note moreover that both functors are left-exact: that is, given an exact
sequence
\[ M' \to M \to M'' \to 0, \]
there are induced exact sequences
\[ 0 \to T_1(M'') \to T_1(M) \to T_1(M'), \quad 0 \to T_2(M'') \to T_2(M) \to
T_2(M') .\]
Here we are using the fact that $\hom$ is always a left-exact functor and the
fact that tensoring with $S$ preserves exactness. (Thus it is here that we use
flatness.)
Now the following lemma will complete the proof:
\begin{lemma}
Let $T_1, T_2$ be contravariant, left-exact additive functors from the category of
$R$-modules to the category of abelian groups. Suppose a natural transformation
$t: T_1(M) \to T_2(M)$ is given, and suppose this is an isomorphism whenever
$M$ is finitely generated and free. Then it is an isomorphism for any finitely
presented module $M$.
\end{lemma}
\begin{proof}
This lemma is a diagram chase. Fix a finitely presented $M$, and choose a
presentation
\[ F' \to F \to M \to 0, \]
with $F', F$ finitely generated and free.
Then we have an exact and commutative diagram
\[ \xymatrix{
0 \ar[r] & T_1(M) \ar[d]^{} \ar[r] & T_1(F) \ar[d]^{\simeq} \ar[r] &
T_1(F') \ar[d]^{\simeq} \\
0 \ar[r] & T_2(M) \ar[r] & T_2(F) \ar[r] & T_2(F') .
}\]
By hypotheses, the two vertical arrows to the right are isomorphisms, as
indicated. A diagram chase now shows that the remaining arrow is an
isomorphism, which is what we wanted to prove.
\end{proof}
\end{proof}
\begin{example}
Let us now consider finitely generated flat modules over a principal ideal
domain $R$. By \rref{structurePID}, we know that any such $M$ is isomorphic to a
finite direct sum $\bigoplus R/(a_i)$ for some $a_i \in R$; we may discard the
summands where $a_i$ is a unit, as those are zero. If some $a_i$ were a nonzero
nonunit, it would be a zerodivisor on the nonzero summand $R/(a_i)$, hence on $M$.
However, no element of $R - \left\{0\right\}$ can be a zerodivisor on the flat
module $M$: tensoring the injection $R \stackrel{a}{\to} R$ with $M$ yields an
injection $M \stackrel{a}{\to} M$. It
follows that all the $a_i = 0$, so $M$ is free. In particular, we have proved:
\begin{proposition}
A finitely generated module over a PID is flat if and only if it is free.
\end{proposition}
\end{example}
\subsection{Finitely presented flat modules}
In \cref{projmoduleisflat}, we saw that a projective module over any ring $R$
was automatically flat. In general, the converse is false. For instance,
$\mathbb{Q}$ is a flat $\mathbb{Z}$-module (as tensoring by $\mathbb{Q}$ is a
form of localization). However, because $\mathbb{Q}$ is divisible (namely,
multiplication by $n$ is surjective for any $n \neq 0$), $\mathbb{Q}$ cannot be a free
abelian group; since projective abelian groups are free, $\mathbb{Q}$ is not
projective either.
Nonetheless:
\begin{theorem} \label{fpflatmeansprojective}
A finitely presented flat module over a ring $R$ is projective.
\end{theorem}
\begin{proof}
We follow \cite{We95}.
Let us define the following contravariant functor from $R$-modules to $R$-modules.
Given $M$, we send it to $M^* = \hom_\mathbb{Z}(M, \mathbb{Q}/\mathbb{Z})$.
This is made into an $R$-module in the following manner: given $\phi: M \to
\mathbb{Q}/\mathbb{Z}$ (which is just a homomorphism of abelian groups!) and $r
\in R$, we send this to $r\phi$ defined by $(r\phi)(m) = \phi(rm)$.
Since $\mathbb{Q}/\mathbb{Z}$ is an injective abelian group, we see that $M
\mapsto M^*$ is an \emph{exact} contravariant functor from $R$-modules to
$R$-modules.
In fact, we note that if $0 \to A \to B \to C \to 0$ is exact, then $0 \to C^* \to B^* \to A^* \to 0$ is exact.
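For instance, for a finite cyclic group one can compute this dual directly:
\[ (\mathbb{Z}/n\mathbb{Z})^* = \hom_{\mathbb{Z}}(\mathbb{Z}/n\mathbb{Z}, \mathbb{Q}/\mathbb{Z}) \simeq \mathbb{Z}/n\mathbb{Z}, \]
a homomorphism being determined by the image of $1$, which can be any element
annihilated by $n$, i.e. any element of $\frac{1}{n}\mathbb{Z}/\mathbb{Z}$. In
particular, nonzero cyclic groups have nonzero duals, a fact used in the lemma below.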
Let $F$ be any $R$-module. There is a natural homomorphism
\begin{equation} \label{twoduals} M^* \otimes_R F \to \hom_R(F, M)^*.
\end{equation}
This is defined as follows. Given $\phi: M \to \mathbb{Q}/\mathbb{Z}$ and $x \in
F$, we define a new map $\hom(F, M) \to \mathbb{Q}/\mathbb{Z}$ by sending a
homomorphism $\psi: F \to M$ to $\phi(\psi(x))$.
In other words, we have a natural map
\[ \hom_{\mathbb{Z}}(M, \mathbb{Q}/\mathbb{Z} ) \otimes_R F \to
\hom_{\mathbb{Z}}( \hom_R(F, M), \mathbb{Q}/\mathbb{Z}). \]
Now fix $M$.
This map \eqref{twoduals} is an isomorphism if $F$ is \emph{finitely
generated} and free.
Moreover, both sides of \eqref{twoduals}, viewed as functors of $F$, are
right-exact (for the second, because dualizing is contravariant and exact).
The ``finite presentation trick'' now shows that the map is an isomorphism if
$F$ is finitely presented.
\add{this should be elaborated on}
Fix now $F$ finitely presented and flat, and consider the above two quantities
in \eqref{twoduals} as functors in $M$.
Then the first functor is exact, so the second one is too.
In particular, $\hom_R(F, M)^*$ is an exact functor in $M$; in particular, if
$M \twoheadrightarrow M''$ is a surjection, then
\[ \hom_R(F, M'')^* \to \hom_R(F, M)^* \]
is an injection. But this implies that
\[ \hom_R(F, M) \to \hom_R(F, M'') \]
is a \emph{surjection,} i.e. that $F$ is projective.
Indeed:
\begin{lemma} $ A \to B \to C $ is exact if and only if $C^* \to B^* \to A^* $ is exact.
\end{lemma}
\begin{proof}
Indeed, one direction was already clear (from $\mathbb{Q}/\mathbb{Z}$ being an
injective abelian group).
Conversely, we note that $M = 0$ if and only if $M^* = 0$ (again by
injectivity and the fact that $(\mathbb{Z}/a)^* \neq 0$ whenever $\mathbb{Z}/a \neq 0$).
Thus dualizing reflects isomorphisms: if a map becomes an isomorphism after
dualized, then it was an isomorphism already. From here it is easy to deduce
the result (by applying the above fact to the kernel and image).
\end{proof}
\end{proof}