```
V --α(g)--→ V
|           |
T           T
↓           ↓
W --β(g)--→ W
```

Such a vertical map $T$ is called an intertwining map or an equivariant map:
$T(\alpha(g)(v)) = \beta(g)(T(v))$

Now, since $\beta(g)$ is invertible (it must be, since it is an element of $GL(W)$), I can rewrite
the above as:
$\beta(g)^{-1}(T(\alpha(g)(v))) = T(v)$

This required that $T: V \rightarrow W$ be an intertwining map. Can we generalize this
to an arbitrary linear map $L: V \rightarrow W$ by averaging? $\begin{aligned}
&\overline{L}: V \rightarrow W \\
&\overline{L}(v) \equiv 1/|G|\sum_{g \in G} \beta(g)^{-1} L \alpha(g) v
\end{aligned}$

We average the expression $\beta(g)^{-1} L \alpha(g)$ over the group to produce $\overline{L}$.
Is $\overline{L}$ an intertwiner? Yes, because when we compute $\beta(h)^{-1} \overline L \alpha(h)$,
the averaging trick winds up shifting the index, exactly as it does for the averaged inner product:
$\begin{aligned}
&\beta(h)^{-1} \overline L \alpha(h) \\
&=\beta(h)^{-1} \left( 1/|G| \sum_{g \in G} \beta(g)^{-1} L \alpha(g) \right) \alpha(h) \\
&=1/|G| \sum_{g \in G} \beta(gh)^{-1} L \alpha(gh) \\
&=1/|G| \sum_{k \in G} \beta(k)^{-1} L \alpha(k) \qquad (k \equiv gh \text{ ranges over all of } G \text{ as } g \text{ does}) \\
&= \overline{L}
\end{aligned}$
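As a numerical sanity check (my own sketch, not from the argument above), we can verify the averaging trick for a small concrete group: take $G = \mathbb{Z}/3$ acting on $\mathbb{C}^3$ by cyclic shifts, with $\alpha = \beta$ this permutation representation, and check that $\overline{L}$ intertwines:

```python
import numpy as np

n = 3
# alpha(g) = beta(g) = cyclic shift by g: a 3-dim representation of Z/3
P = np.roll(np.eye(n), 1, axis=0)            # generator: cyclic permutation matrix
reps = [np.linalg.matrix_power(P, g) for g in range(n)]

rng = np.random.default_rng(0)
L = rng.standard_normal((n, n))              # an arbitrary linear map V -> W

# L_bar = 1/|G| * sum_g beta(g)^{-1} L alpha(g)
L_bar = sum(np.linalg.inv(R) @ L @ R for R in reps) / n

# intertwining check: beta(h)^{-1} L_bar alpha(h) == L_bar for every h in G
for R in reps:
    assert np.allclose(np.linalg.inv(R) @ L_bar @ R, L_bar)
print("L_bar is an intertwiner")
```

The check passes for any starting `L`: averaging projects an arbitrary linear map onto the space of intertwiners.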

Thus, for every linear map $L: V \rightarrow W$, if the irreducible representation $\alpha$ is not
isomorphic to the irreducible representation $\beta$, then Schur's lemma forces $\overline{L} = 0$, or:
$\begin{aligned}
&\sum_{g \in G} \beta(g)^{-1} L \alpha(g) = 0 \\
&\left( \sum_{g \in G} \beta(g)^{-1} L \alpha(g) \right)[i][j] = 0 \\
&\sum_{g \in G} \beta(g)^{-1}[i][p] L[p][q] \alpha(g)[q][j] = 0 \\
&\text{(since $\beta$ is unitary, $\beta(g)^{-1}[i][p] = \beta(g)^*[p][i]$)} \\
&\sum_{g \in G} \beta(g)^*[p][i] L[p][q] \alpha(g)[q][j] = 0 \\
\end{aligned}$

(repeated indices $p, q$ are implicitly summed over.)

The above equality holds for every linear map $L$; in particular, we may choose $L$ to be the matrix
unit $L[p][q] = \delta[p][r]\delta[q][s]$, which isolates a single matrix coefficient: $\begin{aligned}
&\sum_{g \in G} \beta(g)^*[p][i] \delta[p][r] \delta[q][s] \alpha(g)[q][j] = 0 \\
&\sum_{g \in G} \beta(g)^*[r][i] \alpha(g)[s][j] = 0 \\
\end{aligned}$

This tells us that for any choice of indices $[r, i]$ into $\beta$ and $[s, j]$ into $\alpha$, the two matrix
coefficients are orthogonal when viewed as vectors indexed by the group elements $g \in G$.
If the representations are one-dimensional representations (i.e. characters), then we have no freedom
in indexing, and the above becomes:
$\begin{aligned}
&\sum_{g \in G} \beta(g)^* \alpha(g) = 0
\end{aligned}$
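For example (my own illustration), the three characters of $\mathbb{Z}/3$ are $\chi_k(g) = \omega^{kg}$ with $\omega = e^{2\pi i/3}$, and they are pairwise orthogonal (and each has norm 1 under the averaged inner product):

```python
import numpy as np

# The three 1-d irreps (characters) of Z/3: chi_k(g) = omega^(k*g)
omega = np.exp(2j * np.pi / 3)
chars = [np.array([omega ** (k * g) for g in range(3)]) for k in range(3)]

for k in range(3):
    for l in range(3):
        # 1/|G| * sum_g chi_k(g)* chi_l(g)  (np.vdot conjugates its first arg)
        inner = np.vdot(chars[k], chars[l]) / 3
        assert np.isclose(inner, 1.0 if k == l else 0.0)
print("characters of Z/3 are orthonormal")
```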

Thus, distinct irreducible characters are orthogonal.

Next, compute the character $r_G$ of the regular representation, where $G$ acts on a vector space with one basis vector per group element:
- The matrix for the identity element is the identity matrix, and the size of the matrix is the dimension of the vector space, which is $|G|$ since there's a basis vector for each element of $G$. Thus, $r_G(1) = |G|$.
- For any other element $g \in G$, the regular representation will be a permutation matrix with no fixed points. Thus, the diagonal of the matrix is all zeros, and hence $r_G(g) = 0$.

$\begin{aligned}
& \langle r_G | \chi_\alpha \rangle = 1/|G| \sum_{g \in G} r_G(g)^* \chi_\alpha(g) \\
&= 1/|G| (r_G(1) \cdot \chi_\alpha(1)) \\
&= 1/|G| (|G| \cdot \dim \alpha) \\
&= \dim \alpha
\end{aligned}$

since $\chi_\alpha(1) = \operatorname{tr}(\alpha(1)) = \dim \alpha \geq 1$.
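As a quick numerical check (my own sketch; for $\mathbb{Z}/3$ every irrep is one-dimensional, so the inner product with each irrep character should come out to $1$):

```python
import numpy as np

# Character of the regular representation of Z/3: |G| at the identity, 0 elsewhere
r_G = np.array([3, 0, 0])

# The three irreducible characters of Z/3: chi_k(g) = omega^(k*g)
omega = np.exp(2j * np.pi / 3)
chars = [np.array([omega ** (k * g) for g in range(3)]) for k in range(3)]

for chi in chars:
    inner = np.vdot(r_G, chi) / 3      # 1/|G| * sum_g r_G(g)* chi(g)
    assert np.isclose(inner, 1.0)      # = dim(chi) = 1 for Z/3
print("regular rep of Z/3 contains each irrep")
```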

Thus, the regular representation contains every irrep: the character of the regular representation has
non-zero inner product with each irrep character, and the irrep characters are mutually orthogonal.