Tensors 2

Tensor as an element of tensor product of vector spaces

Before presenting another way of defining a tensor, we introduce a notation. A linear map and a bilinear form are written as linear combinations of e_i\varepsilon^j and \varepsilon^i\varepsilon^j respectively. Each of these (for any i,j up to the dimensions of the corresponding spaces) can be considered as a single new object and denoted, for example, as \clubsuit_i^j:=e_i\varepsilon^j and \spadesuit^{ij}:=\varepsilon^i\varepsilon^j. Writing the basis vectors and/or basis covectors adjacent to each other is usually denoted by e_i\otimes\varepsilon^j and \varepsilon^i\otimes\varepsilon^j. This notation is referred to as the tensor product of (basis) vectors; a general definition will be presented later. Using this notation, for now, we can write a linear map and a bilinear form as,

    \[T=T_j^ie_i\otimes\varepsilon^j\quad\quad \mathfrak B = \mathfrak B_{ij}\varepsilon^i\otimes\varepsilon^j\]

This notation can be extended to any finite linear combination of tensor products of basis vectors and/or covectors, where the combination coefficients take indices following the index-level convention. For example, we can write,

    \[\mathcal T:=\mathcal T^{ijlt}_{ks} e_i\otimes e_j\otimes e_l\otimes e_t \otimes \varepsilon^k\otimes \varepsilon^s    \]
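As a quick numerical illustration (not part of the text's development), one can model V as \mathbb R^n, the basis tensors e_i\otimes\varepsilon^j as outer products of standard basis vectors in NumPy, and rebuild a linear map from its components T_j^i:

```python
import numpy as np

# Sketch, assuming V = R^n with the standard basis: e_i ⊗ ε^j is modelled
# as the outer product of the i-th basis vector and the j-th dual basis
# covector, so T = T^i_j e_i ⊗ ε^j is a component-weighted sum of these.
n = 3
e = np.eye(n)      # rows e[i] play the role of the basis vectors e_i
eps = np.eye(n)    # rows eps[j] play the role of the dual covectors ε^j

T_components = np.arange(n * n, dtype=float).reshape(n, n)  # T^i_j (arbitrary)

# T = T^i_j e_i ⊗ ε^j, summing over both indices
T = sum(T_components[i, j] * np.outer(e[i], eps[j])
        for i in range(n) for j in range(n))

assert np.allclose(T, T_components)  # the components reappear as matrix entries
```

The assertion confirms that the components with respect to the basis \{e_i\otimes\varepsilon^j\} are exactly the matrix entries of the linear map.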

Let’s define the tensor product of vectors and covectors and its rules.

Tensor product of vectors and covectors

Let u, v and w be vectors or covectors (not necessarily basis ones); then we define the tensor product of each pair as uv \equiv u\otimes v, etc., together with the following rules and operations,

  0. Order matters: u\otimes v \ne v\otimes u.
  1. Scalar multiplication: \alpha (u\otimes v) = (\alpha u) \otimes v = u\otimes (\alpha v).
  2. Addition: u\otimes (v+w)=u\otimes v + u \otimes w and (v+w)\otimes u = v\otimes u + w \otimes u.
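As a numerical check (an illustration under the assumption V = \mathbb R^n, not from the text), the pairwise rules above become array identities when the tensor product of vectors is realized as NumPy's outer product:

```python
import numpy as np

# Sketch: np.multiply.outer realizes u ⊗ v for vectors in R^n as a
# 2-index array; the rules 0-2 above are then checkable identities.
rng = np.random.default_rng(0)
u, v, w = (rng.standard_normal(4) for _ in range(3))
alpha = 2.5

t = lambda a, b: np.multiply.outer(a, b)  # u ⊗ v

# 0. order matters: u ⊗ v and v ⊗ u generically differ
assert not np.allclose(t(u, v), t(v, u))
# 1. a scalar slides through either factor
assert np.allclose(alpha * t(u, v), t(alpha * u, v))
assert np.allclose(alpha * t(u, v), t(u, alpha * v))
# 2. distributivity over addition in each slot
assert np.allclose(t(u, v + w), t(u, v) + t(u, w))
assert np.allclose(t(v + w, u), t(v, u) + t(w, u))
```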

The above rules can be extended to tensor product of any number of vectors or covectors. For example,

1. Scalar multiplication:

    \[\alpha (u\otimes v\otimes w \otimes x \otimes y) = (\alpha u) \otimes v \otimes w \otimes x \otimes y= u\otimes (\alpha v) \otimes w \otimes x \otimes y = \cdots = u\otimes v \otimes w \otimes x \otimes (\alpha y)\]

2. Addition:

    \[u\otimes v \otimes w \otimes x \otimes y + u\otimes v \otimes z \otimes x \otimes y = u\otimes v \otimes (w+z)\otimes x \otimes y\]

The above rules can be used to construct vector spaces, called tensor-product vector spaces. For example, if v\in V and f\in V^*, then,

    \[v\otimes f \in V\bigotimes V^*\]

Any number of vector spaces can enter a tensor product. For example, V\bigotimes V\bigotimes V^*\bigotimes V\bigotimes V^* has members like u\otimes v \otimes f \otimes w \otimes h with u,v,w \in V and f,h\in V^*.

Note that the tensor product can also be taken over entirely different vector spaces over the same field, e.g. V\bigotimes W\bigotimes U^*.

Basis for a tensor product space

Let V and W be vector spaces with bases \{e_i\}_1^n and \{\zeta_j\}_1^m respectively. If v\in V and w\in W, we can write,

    \[v\otimes w = (v^ie_i)\otimes (w^j\zeta_j)=v^i w^j e_i\otimes \zeta_j\]

This states that any vector v\otimes w \in V\bigotimes W can be written as a linear combination of e_i\otimes \zeta_j. Therefore,

  1. The set of vectors \{e_i\otimes \zeta_j \mid i=1,\cdots, n \text{ and } j=1,\cdots, m\} is a basis for the vector space V\bigotimes W.
  2. The dimension of the vector space V\bigotimes W is n\times m.
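These two facts can be verified numerically (a sketch assuming V = \mathbb R^n and W = \mathbb R^m with standard bases, tensor products realized as outer products):

```python
import numpy as np

# Sketch: the n*m flattened arrays e_i ⊗ ζ_j are linearly independent,
# so they form a basis of V ⊗ W and dim(V ⊗ W) = n*m.
n, m = 3, 2
e = np.eye(n)       # basis of V
zeta = np.eye(m)    # basis of W

basis = np.array([np.outer(e[i], zeta[j]).ravel()
                  for i in range(n) for j in range(m)])
assert np.linalg.matrix_rank(basis) == n * m  # n*m independent vectors

# the coordinates of v ⊗ w in this basis are exactly the products v^i w^j
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0])
coords = np.outer(v, w).ravel()               # v^i w^j in (i,j) order
recon = sum(coords[k] * basis[k] for k in range(n * m))
assert np.allclose(recon, np.outer(v, w).ravel())
```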

The above can be extended to the tensor product of any number of vector spaces; i.e., the tensor products of the basis vectors of the factor spaces form a basis for the resultant tensor-product space.

Example: Let u=u^ie_i\in V and f=f_i\varepsilon^i,h=h_j\varepsilon^j \in V^* and \alpha\in \mathbb R.

Then, \{e_i\otimes \varepsilon^j\}_{i,j=1}^n is a basis for V\bigotimes V^*, and,

    \[\begin{split}V\bigotimes V^* &\ni u\otimes f + \alpha u\otimes h = u\otimes (f + \alpha h) \\& =u^ie_i\otimes(f_j\varepsilon^j + \alpha h_k\varepsilon^k)=u^if_je_i\otimes\varepsilon^j+\alpha u^ih_ke_i\otimes\varepsilon^k\\&\overset{k \text{ is dummy}}{=}u^if_je_i\otimes\varepsilon^j+\alpha u^ih_je_i\otimes\varepsilon^j=u^i(f_j +\alpha h_j)(e_i\otimes\varepsilon^j)\equiv \varphi_{ij}e_i\otimes\varepsilon^j\end{split}\]
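The computation above can be checked numerically (a sketch with V = \mathbb R^3 and arbitrary components, tensor products realized as outer products):

```python
import numpy as np

# Sketch: verify that u ⊗ f + α u ⊗ h has components φ_{ij} = u^i (f_j + α h_j),
# with all component values chosen arbitrarily for the check.
u = np.array([1.0, 2.0, 3.0])      # u^i
f = np.array([0.5, -1.0, 2.0])     # f_j
h = np.array([1.0, 4.0, -2.0])     # h_j
alpha = 3.0

lhs = np.outer(u, f) + alpha * np.outer(u, h)  # u ⊗ f + α u ⊗ h
phi = np.outer(u, f + alpha * h)               # φ_{ij} = u^i (f_j + α h_j)
assert np.allclose(lhs, phi)
```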

Tensor by tensor product

Definition (Tensor-product view): A tensor is a collection of vectors and covectors combined together using the tensor product (of vectors and/or covectors). A tensor \mathcal T of type (r,s) is a member of the tensor-product space,

    \[\underbrace{V^*\bigotimes \cdots\bigotimes V^*}_{\text{r times}}\bigotimes \underbrace{V\bigotimes \cdots\bigotimes V}_{\text{s times}}\]

and written as,

    \[\mathcal T = \mathcal T_{i_1\cdots i_r}^{j_1\cdots j_s}\varepsilon^{i_1}\otimes\cdots\otimes \varepsilon^{i_r}\otimes e_{j_1}\otimes\cdots\otimes e_{j_s}\]

Note that \mathcal T_{i_1\cdots i_r}^{j_1\cdots j_s} collects the components (coordinates) of the tensor \mathcal T with respect to the basis \{\varepsilon^{i_1}\otimes\cdots\otimes \varepsilon^{i_r}\otimes e_{j_1}\otimes\cdots\otimes e_{j_s}\}.

In this view, a vector v=v^ie_i\in V is a (0,1) tensor, and a covector f=f_i\varepsilon^i\in V^* is a (1,0) tensor. A linear map T=T_j^i\varepsilon^j\otimes e_i is a (1,1) tensor, a bilinear form T=T_{ij}\varepsilon^i\otimes \varepsilon^j is a (2,0) tensor, and a bilinear map T=T_{ij}^k \varepsilon^i\otimes\varepsilon^j\otimes e_k is a (2,1) tensor.
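The tensor type corresponds directly to the index pattern when contracting with vectors. As an illustration (a sketch with arbitrary components, not from the text), np.einsum makes the pattern explicit:

```python
import numpy as np

# Sketch: a (1,1) tensor eats one vector and returns a vector; a (2,0)
# tensor eats two vectors and returns a scalar. Components are arbitrary.
rng = np.random.default_rng(1)
n = 3
T11 = rng.standard_normal((n, n))   # T^i_j, a linear map
T20 = rng.standard_normal((n, n))   # T_{ij}, a bilinear form
u, v = rng.standard_normal(n), rng.standard_normal(n)

# (1,1): contract the lower index with u^j, leaving a vector T^i_j u^j
assert np.allclose(np.einsum('ij,j->i', T11, u), T11 @ u)
# (2,0): contract both lower indices, leaving the scalar T_{ij} u^i v^j
assert np.isclose(np.einsum('ij,i,j->', T20, u, v), u @ T20 @ v)
```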