Tricki
a repository of mathematical know-how

Tensor products

On my blog, asooo writes the following:

If someone needs an idea for an article, I would suggest examples and uses of tensor products. I did learn a long time ago that they are the most general bilinear maps, but I never found out how they could be used.

I once wrote an article called "How to lose your fear of tensor products" on my home page, but that was much more about what tensor products are than about what they are good for. Does anyone have some bright ideas for how to illustrate the latter? What I'd really like to see is a problem that does not mention tensor products, which is easy to solve if you use the basic theory of tensor products and a pain to solve otherwise (presumably because you find yourself inventing the theory of tensor products in order to solve the problem). Actually, on writing this I have just thought of such a problem, but it's such a beautiful question that I don't want to give the answer away on the Tricki, and in any case it doesn't illustrate what one might call the "everyday use" of tensor products.

I think the graph theory tensor product is fairly intuitive and can result in a "natural" problem (the complement of the tensor product of a complete graph with itself is a rook graph, for instance). I'm not sure if there's a way to transition to vector spaces from that, though.
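For what it's worth, the rook-graph claim above can be checked directly in a few lines of Python (standard library only). The helper names `tensor_product`, `rook_graph` and `complement` are ad hoc, not from any library:

```python
from itertools import product

def tensor_product(n):
    # Tensor (categorical) product of K_n with itself:
    # (a, b) ~ (c, d) iff a != c and b != d.
    V = list(product(range(n), repeat=2))
    return {frozenset((u, v)) for u in V for v in V
            if u[0] != v[0] and u[1] != v[1]}

def rook_graph(n):
    # Rook's graph on an n x n board: same row or same column.
    V = list(product(range(n), repeat=2))
    return {frozenset((u, v)) for u in V for v in V
            if u != v and (u[0] == v[0] or u[1] == v[1])}

def complement(edges, V):
    all_edges = {frozenset((u, v)) for u in V for v in V if u != v}
    return all_edges - edges

n = 4
V = list(product(range(n), repeat=2))
assert complement(tensor_product(n), V) == rook_graph(n)
```

The check works for any n, since an edge is missing from the tensor product exactly when the two vertices agree in a coordinate, i.e. exactly when they share a row or a column.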

I'm not sure I know what the problem is that you are talking about – are you really suggesting a statement that doesn't involve tensor products but which is most easily proved using tensor products? In any case, I'd be a little uneasy about using this as an example unless there had already been several more standard algebraic examples.

How about trying to form a sequence of Walsh matrices? That naturally forces the use of the Kronecker product.

(EDIT: Fiddling more, this is very promising. Not only is there a direct geometric interpretation, but a direct combinatoric one as well, and an explanation can start from a naive standpoint of wanting to create an error-correcting code.)
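A minimal sketch of that idea: Sylvester's recursion builds Hadamard matrices (Walsh matrices up to row reordering) as iterated Kronecker products, which numpy exposes as `np.kron`:

```python
import numpy as np

def hadamard(k):
    # Sylvester construction: H_{2^k} = H_2 (kron) H_{2^{k-1}}.
    H = np.array([[1]])
    H2 = np.array([[1, 1], [1, -1]])
    for _ in range(k):
        H = np.kron(H2, H)
    return H

H8 = hadamard(3)
# Rows are mutually orthogonal: H8 @ H8.T = 8 * I,
# which is what makes them usable as an error-correcting code.
assert np.array_equal(H8 @ H8.T, 8 * np.eye(8))
```

The rows of such a matrix (with -1 read as 0) are codewords of a Hadamard code; the orthogonality assertion is exactly the large-pairwise-distance property that makes the code error-correcting.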

I saw an easy application in Richard Stanley's Enumerative Combinatorics 2, and wrote it as an example: How to use tensor products. I don't know if this example fits the bill, but I hope that it's okay!

Here's one:

Q: Given two (non-linear) characters of a finite group, how would one show that their product is also a character?

A: If we tensor the corresponding representations, we get a new representation. The character afforded by this new representation is precisely the product of the two original characters.

I'm not sure how good of an example this is, since the question might be hard to motivate.
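For the record, the computation behind that answer is a one-liner, using the trace identity $\operatorname{tr}(A \otimes B) = \operatorname{tr}(A)\operatorname{tr}(B)$:

```latex
\chi_{V \otimes W}(g)
  = \operatorname{tr}\bigl(\rho_V(g) \otimes \rho_W(g)\bigr)
  = \operatorname{tr}\rho_V(g)\,\operatorname{tr}\rho_W(g)
  = \chi_V(g)\,\chi_W(g).
```

So the product of the two characters is the character of the tensor product representation.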

I am soon going to write a representation theory article that will illustrate an "everyday use" of tensor products.

I'm not sure what the best way to phrase it formally is, but I think that the standard proof of the tower law

[L : K] = [L : F][F : K]

for field extensions is essentially a tensor power-type idea. One doesn't really end up generating the theory of tensor products, though one has the essential idea: "I have a basis v_1, \dots, v_m for L/F and a basis w_1, \dots, w_n for F/K; let me make a basis for L/K by taking the elements v_i w_j."
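The basis bookkeeping in that proof can be written out explicitly. Any $x \in L$ expands first over $F$ and then over $K$:

```latex
x \;=\; \sum_{i=1}^{m} a_i v_i \quad (a_i \in F)
  \;=\; \sum_{i=1}^{m} \sum_{j=1}^{n} b_{ij}\, v_i w_j \quad (b_{ij} \in K),
```

so the $mn$ products $v_i w_j$ span $L$ over $K$ (and a similar unravelling shows they are independent), giving $[L : K] = mn = [L : F][F : K]$.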

This crops up reasonably naturally if one starts wondering about things like 2^{1/2} + 3^{1/3}.

Oh, I just saw the article that has been started. I'll add this semi-example to it later unless someone else objects or does it first.

I think examples illustrating how to construct geometric objects representing certain functors should be instructive.

The first example is obviously the construction of products of algebraic varieties. It forces you to "invent" tensor products once you realize that the product in the geometric category should correspond to the coproduct in the algebraic category, at least locally.
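In the simplest affine case this is completely concrete. For affine spaces over a field $k$:

```latex
A(\mathbb{A}^m \times \mathbb{A}^n)
  \;=\; k[x_1,\dots,x_m,\,y_1,\dots,y_n]
  \;\cong\; k[x_1,\dots,x_m] \otimes_k k[y_1,\dots,y_n]
  \;=\; A(\mathbb{A}^m) \otimes_k A(\mathbb{A}^n),
```

and the tensor product is forced on you as soon as you ask what ring of functions the product variety should carry.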

The second example that came to my mind is Galois theory. Let F and K be fields, and suppose F contains K. Under some mild assumptions, embeddings of F over K into a universal field are given by the prime spectrum of the tensor product of F with itself over K. Of course, one can do Galois theory without tensor products but, for instance, in the case of differential Galois theory, they can be used to give quite short proofs.

An elementary example of the tensor product of vector spaces can be seen in real-valued functions of two variables.

Given spaces U and V of functions of one variable, one can construct the space U \otimes V inside the space of all functions f(x,y): it is spanned by the products f(x)g(y) where f(x) \in U and g(y) \in V.

I first saw this in the definition of a tensor product of measure spaces.

One can take this further and define an operator T and choose bases \{p_n\},\{q_n\} for the spaces. Then the matrix tensor (Kronecker) product comes into play.

Furthermore, operators can be factored, e.g. \frac{\partial^2}{\partial x^2}-\frac{\partial^2}{\partial y^2} = 4\,\frac{\partial}{\partial u}\frac{\partial}{\partial v}, where u=x+y and v=x-y.
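A quick numpy check of the basis picture above: once bases \{p_n\},\{q_n\} are chosen, an operator acting factor-wise on product functions becomes the matrix Kronecker product, and one can verify the mixed-product identity (A⊗B)(u⊗v) = (Au)⊗(Bv). The matrices and vectors here are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # operator on the first factor
B = rng.standard_normal((4, 4))   # operator on the second factor
u = rng.standard_normal(3)        # coefficient vector of f in {p_n}
v = rng.standard_normal(4)        # coefficient vector of g in {q_n}

# Kronecker product acts factor-wise: (A kron B)(u kron v) = (A u) kron (B v)
lhs = np.kron(A, B) @ np.kron(u, v)
rhs = np.kron(A @ u, B @ v)
assert np.allclose(lhs, rhs)
```

This is precisely why a separable operator on f(x)g(y) can be applied one factor at a time.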

1. Doing away with a basis: If V is a (finite-dimensional) vector space, then the space of all linear functionals is the dual space V*. What does the vector space of all bilinear maps f: V x V -> R look like? You could introduce a matrix A and write f(u,v) = v'Au, or you could write down V* tensor V*, the latter being basis-independent. (And similarly, one can write down an actual bilinear map using the tensor product. Related too, in the infinite dimensional case, the tensor product is a convenient way of writing down a general nuclear operator.)
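A small numerical illustration of point 1, in the comment's convention f(u,v) = v'Au; the matrix A here is an arbitrary placeholder:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)

# The bilinear form written with a matrix: f(u, v) = v' A u
f = v @ A @ u

# The same number read as an element of V* tensor V*: A_ij is the
# coefficient of e_i* (x) e_j* against the tensor-product basis.
g = sum(A[i, j] * v[i] * u[j] for i in range(3) for j in range(3))
assert np.isclose(f, g)
```

The matrix description silently fixes a basis; the V* ⊗ V* description is the basis-free object of which A is the coordinate expression.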

2. Change of rings: In the simplest case, let V be a real vector space. How to make it into a complex vector space? C tensor V.

3. Why might the tensor product be hard to teach? In part, there may not be an elementary "killer app" for it; it is all too easy simply to omit the tensor product sign and use juxtaposition. In part, its definition presumably evolved through the use of abstraction and hence it is "not immediately obvious" why one would want to use a tensor product.

4. How to teach? Possibly by 1 and 2 above, then arriving at the definition, then explaining in general terms that the usefulness comes from its *unification* properties, for example:
i) In homological algebra, the tensor product unifies several important "operations" on modules (e.g. quotients and localisations), therefore, proving a theorem involving tensor products is more efficient (and insightful).
ii) In algebraic geometry, as mentioned in an earlier comment, A(X x Y) = A(X) tensor A(Y). [Then there is the definition of flatness...]
iii) In differential geometry, it provides a unified way of working with tensor fields.

In summary, propounding the tensor product as a "unification principle" might be more easily digested by students than attempting to exhibit it as an "important tool" in an elementary context.
