- Projective space is the space of all lines through the origin in $\mathbb R^n$.
- Algebraically constructed as $(V - \{ 0 \})/ \mathbb R^\times$.
- We exclude the origin to remove "degenerate lines": the orbit of $\{ 0 \}$ under $\mathbb R^\times$ is just $\{ 0 \}$, which is zero-dimensional.

- $G(m, V)$: $m$ dimensional subspaces of $V$.
- $G(m, n)$: $m$ dimensional subspaces of $V = k^n$.
- $G(m+1, n+1)$ is the space of $m$-planes $\mathbb P^m$ in $\mathbb P^n$. We projectivize by sending $(x_0, x_1, \dots, x_n) \in k^{n+1} - \{ 0 \}$ to $[x_0 : x_1 : \dots : x_n] \in \mathbb P^n$.
- Duality: $G(m, V) \simeq G(\dim(V)-m, V^\star)$. We map the subspace $W \subseteq V$ to the annihilator of $W$ in $V^\star$: that is, we map $W$ to the set of all linear functionals that vanish on $W$ [i.e., whose kernel contains $W$].
- The above implies $G(1, V) \simeq G(n-1, V^\star)$ where $n = \dim V$. $G(1, V)$ is just projective space $\mathbb P(V) \simeq \mathbb P^{n-1}$. The $(n-1)$-dimensional subspaces (hyperplanes) are each cut out by a single linear equation $c_1 x_1 + \dots + c_n x_n = 0$.

- These are lines in $\mathbb P^3$. A line is determined by a pair of points of the form $(x_0, y_0, z_0, w_0)$ and $(x_1, y_1, z_1, w_1)$. That is, we're considering "lines" between "points" (or "vectors") in $\mathbb R^3$. Exactly what we need to solve stabbing line problems for computer graphics :)
- Start by taking a 2D plane. The line will pass through a point in the 2D plane. This gives us two degrees of freedom.
- Then take a direction in ordinary Euclidean $\mathbb R^3$ (or $S^2$ to be precise). This gives us two degrees of freedom.
- Can also be said to be a 2-dim. subspace of a 4-dim. vector space.
- In total, $G(2, 4)$ should therefore have four degrees of freedom.
- Take $W \subseteq V$ where $V \simeq k^4$ and $W$ is a 2-dimensional subspace.
- $W$ is spanned by two vectors $v_1, v_2$. So I can record it as a $2 \times 4$ matrix: $\begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \end{bmatrix}$. Vector $v_i$ has coordinates $a_{i1}, \dots, a_{i4}$.
- If I had taken another basis $(v_1', v_2')$, there would be an invertible matrix $B \in K^{2 \times 2}$ (i.e., $\det(B) \neq 0$) that sends $(v_1, v_2)$ to $(v_1', v_2')$. Vice versa, any invertible matrix $B$ gives us a new basis.
- So the redundancy in our choice of parametrization of subspaces (via basis vectors) is captured entirely by the space of $B$s.
- Key idea: compute $2 \times 2$ minors of the $2 \times 4$ matrix $(v_1, v_2)$.
- This is going to be $(a_{11} a_{22} - a_{12} a_{21}, \dots, a_{13} a_{24} - a_{14} a_{23}) \in K^6$.
- Note here that we are computing the $2 \times 2$ minors of a rectangular matrix: we take all possible $2 \times 2$ submatrices and calculate their determinants.
- In this case, we must pick both rows, and we have $\binom{4}{2} = 6$ choices of columns, thus we live in $K^6$.
- We represent this as the map $m: K^{2 \times 4} \to K^6$, $m((a_{ij})) \equiv (a_{11} a_{22} - a_{12} a_{21}, \dots, a_{13} a_{24} - a_{14} a_{23})$, which sends a matrix to its vector of minors.
- The great advantage of this is that we have $m(B \cdot (a_{ij})) = \det(B) \cdot m((a_{ij}))$, since each minor is by definition the determinant of a submatrix, and the determinant is multiplicative.
- Thus, we have converted a *matrix* redundancy (of $B$) in $(a_{ij})$ into a *scalar* redundancy (of $\det(B)$) in $m((a_{ij}))$. We know how to handle scalar redundancies: live in projective space!
- Therefore, we have a well defined map $G(2, 4) \to \mathbb P^5$. Given a subspace $W \in G(2, 4)$, compute a basis $v_1, v_2 \in K^4$ for $W$, then compute the vector of minors $m((v_1, v_2)) \in K^6 - \{ 0 \}$, and send this to $\mathbb P^5$.
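The map $m$ and the scaling identity $m(B \cdot A) = \det(B) \cdot m(A)$ are easy to check numerically. A minimal pure-Python sketch (function names my own):

```python
# Sketch of the minors map m: K^{2x4} -> K^6 and the identity
# m(B @ A) = det(B) * m(A), checked on a random matrix.
from itertools import combinations
import random

def minors(A):
    """The six 2x2 minors of a 2x4 matrix A, columns in lexicographic order."""
    return [A[0][i] * A[1][j] - A[0][j] * A[1][i]
            for i, j in combinations(range(4), 2)]

def matmul(B, A):
    """(2x2) times (2x4) matrix product."""
    return [[sum(B[r][k] * A[k][c] for k in range(2)) for c in range(4)]
            for r in range(2)]

A = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
B = [[1.0, 2.0], [3.0, 4.0]]                      # det(B) = -2
detB = B[0][0] * B[1][1] - B[0][1] * B[1][0]

lhs = minors(matmul(B, A))
rhs = [detB * v for v in minors(A)]
print(all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs)))  # True
```

So a change of basis only rescales the vector of minors, exactly the scalar redundancy projective space absorbs.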

- These are lines in $\mathbb P^3$.
- So take two points in $\mathbb P^3$, call these $[a_0 : a_1 : a_2 : a_3]$ and $[b_0 : b_1 : b_2 : b_3]$. Again, this gives us a matrix:

$\begin{bmatrix}
a_0 & a_1 & a_2 & a_3 \\
b_0 & b_1 & b_2 & b_3
\end{bmatrix}$

- We define $S_{ij} \equiv a_i b_j - a_j b_i$ which is the minor with columns $(i, j)$.
- Then we compress the above matrix into $(S_{01} : S_{02} : S_{03} : S_{12} : S_{13} : S_{23}) \in \mathbb P^5$. See that $S_{ii} = 0$ and $S_{ji} = - S_{ij}$, so only the six $S_{ij}$ with $i < j$ are "useful".
- See that if we scale $a$ or $b$ by a constant, then all the $S_{ij}$ scale by that same constant, and thus the point in $\mathbb P^5$ does not change.
- We can also change $b$ by adding some scaled version of $a$. This is like adding a multiple of the first row to the second row when taking determinants. But this does not change determinants!
- Thus, the Plücker coordinates are invariant under the choice of the two points $a, b$ we use to parametrize the line in $\mathbb P^3$.
- This gives us a well defined map from lines in $\mathbb P^3$ to points in $\mathbb P^5$.
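The invariance above can be checked directly: rescaling $b$, or replacing $b$ by $b + ta$, only rescales the whole coordinate vector, so the point of $\mathbb P^5$ is unchanged. A small sketch (names my own):

```python
# Check that the Pluecker point in P^5 does not depend on which two
# points we pick on the line.
from itertools import combinations

def pluecker(a, b):
    """The six coordinates S_ij = a_i b_j - a_j b_i, i < j."""
    return [a[i] * b[j] - a[j] * b[i] for i, j in combinations(range(4), 2)]

def same_projective_point(p, q, eps=1e-9):
    """p and q agree in P^5 iff all 2x2 minors of the pair vanish."""
    return all(abs(p[i] * q[j] - p[j] * q[i]) < eps
               for i in range(6) for j in range(6))

a = [1.0, 2.0, 0.0, 3.0]
b = [0.0, 1.0, 4.0, 1.0]
p = pluecker(a, b)
print(same_projective_point(p, pluecker(a, [5 * x for x in b])))  # True
print(same_projective_point(p, pluecker(a, [x + 2 * y for x, y in zip(b, a)])))  # True
```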
- This is not an onto map: lines in $\mathbb P^3$ form a 4-dimensional family (as we counted above: four degrees of freedom), while $\mathbb P^5$ has dimension $5$.
- So heuristically, we are missing "one equation" to cut $\mathbb P^5$ with to get the image of lines in $\mathbb P^3$ in $\mathbb P^5$.
- This is the famous Plücker relation:

$S_{02} S_{13} = S_{01} S_{23} + S_{03} S_{12}$

- It suffices to prove the relationship for the "standard matrix":

$\begin{bmatrix}
1 & 0 & a & b \\
0 & 1 & c & d
\end{bmatrix}$

- In this case, $S_{02} S_{13} = c(-b)$ and $S_{01} S_{23} + S_{03} S_{12} = 1 \cdot (ad - bc) + d(-a)$; both sides equal $-bc$.
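The relation can also be checked numerically for random pairs of points (a quick sketch, not from the lecture):

```python
# Verify S02*S13 = S01*S23 + S03*S12 for the Pluecker coordinates of
# lines through random pairs of points a, b in K^4.
import random

def S(a, b, i, j):
    return a[i] * b[j] - a[j] * b[i]

for _ in range(100):
    a = [random.uniform(-1, 1) for _ in range(4)]
    b = [random.uniform(-1, 1) for _ in range(4)]
    lhs = S(a, b, 0, 2) * S(a, b, 1, 3)
    rhs = S(a, b, 0, 1) * S(a, b, 2, 3) + S(a, b, 0, 3) * S(a, b, 1, 2)
    assert abs(lhs - rhs) < 1e-9
print("relation holds")
```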

- In general, we get *Plücker relations*:

$S_{i_1 \dots i_k} S_{j_1 \dots j_k} = \sum S_{i_1' \dots i_k'} S_{j_1' \dots j_k'}.$

- Suppose the matrix $(a_{ij})$ of $(v_1, v_2)$ has non-vanishing first minor. Let $B \equiv \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}^{-1}$, the inverse of the first minor matrix, and set $(v_1', v_2') \equiv B (v_1, v_2)$.
- Then $(v_1', v_2') = \begin{bmatrix} 1 & 0 & y_{11} & y_{12} \\ 0 & 1 & y_{21} & y_{22} \end{bmatrix}$.
- So the first $2 \times 2$ block is the identity. Further, the $y_{ij}$ are uniquely determined by $W$.
- As we vary the $y_{ij}$, we get different 2-dimensional subspaces of $V$. Thus, locally, the Grassmannian looks like $\mathbb A^4$. This gives us an affine chart!
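Under the assumption that the first minor is invertible, the chart map can be sketched as follows (pure Python, names my own):

```python
# Put a 2x4 matrix whose rows span a 2-plane into the standard chart form
# [[1, 0, y11, y12], [0, 1, y21, y22]] by left-multiplying with the inverse
# of its first 2x2 block.

def standardize(A):
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    det = a * d - b * c
    assert det != 0, "first minor must be non-vanishing"
    Binv = [[d / det, -b / det], [-c / det, a / det]]  # inverse of first block
    return [[sum(Binv[r][k] * A[k][col] for k in range(2)) for col in range(4)]
            for r in range(2)]

A = [[2.0, 1.0, 5.0, 7.0],
     [1.0, 3.0, 2.0, 4.0]]        # rows span a 2-plane in R^4
chart = standardize(A)
ident_ok = all(abs(chart[r][c] - (1.0 if r == c else 0.0)) < 1e-12
               for r in range(2) for c in range(2))
print(ident_ok)                    # True: identity block
print(chart[0][2:], chart[1][2:])  # the y_{ij} coordinates of the chart
```

Row operations don't change the row space, so `chart` spans the same plane as `A`; the four numbers $y_{ij}$ are the local coordinates.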
- We can recover the Grassmannian from the $\mathbb P^5$ embedding. Let $p_0, \dots, p_5$ be the coordinate functions on $\mathbb P^5$ ($p$ for Plücker).
- The polynomial $p_0 p_5 - p_1 p_4 + p_2 p_3$ vanishes on the Grassmannian. One can show that the zero set of this equation is *exactly* the Grassmannian.

- Take all points of the following form:

$\begin{bmatrix}
1 & 0 & * & * \\
0 & 1 & * & *
\end{bmatrix}$

- Let's look at the first column: it is $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$. Why not $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$? Well, I can always cancel the entry in the second row by subtracting a scaled version of the first row (this doesn't change the determinants)! Thus, wherever we have a pivot $1$, the other entry in that column must be a $0$.
- Next, we can have something like:

$\begin{bmatrix}
1 & * & 0 & * \\
0 & 0 & 1 & *
\end{bmatrix}$

- Here, in the second column $\begin{bmatrix} * \\ 0 \end{bmatrix}$, if the bottom entry were not $0$, we could have standardized the column into the form $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$, landing us back in the first case! Thus, we *must* have a $0$ to get a case different from the previous one.

- Continuing, we get:

$\begin{bmatrix}
1 & * & * & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}$

$\begin{bmatrix}
0 & 1 & 0 & * \\
0 & 0 & 1 & *
\end{bmatrix}$

$\begin{bmatrix}
0 & 1 & * & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}$

$\begin{bmatrix}
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}$

- If we count the number of $*$s, which is the number of degrees of freedom, we see that one of them (the last one) has zero stars ($\mathbb A^0$), one has 1 star ($\mathbb A^1$), two have 2 stars ($\mathbb A^2$), one has 3 stars ($\mathbb A^3$), and one has 4 stars ($\mathbb A^4$).
- This lets us read off the cohomology of the Grassmannian: we know the cellular decomposition, i.e., the number of $n$-cells in each dimension.
- Alternatively, over a finite field with $q$ elements, the Grassmannian has $1 + q + 2q^2 + q^3 + q^4$ points. On the other hand, $\mathbb P^4$ has $1 + q + q^2 + q^3 + q^4$ points. Thus the Grassmannian is different from projective space!
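The count can be verified by brute force over $\mathbb F_2$, where the formula predicts $1 + 2 + 2 \cdot 4 + 8 + 16 = 35$ points (a sketch of my own; it enumerates all 2-dimensional subspaces of $\mathbb F_2^4$):

```python
# Count the points of G(2, 4) over F_2 by listing all 2-dimensional
# subspaces of F_2^4 as sets of vectors.
from itertools import product

vectors = [v for v in product(range(2), repeat=4) if any(v)]  # F_2^4 minus 0

def span(u, v):
    """The subspace of F_2^4 spanned by u and v (as a set of 4 vectors)."""
    return frozenset(tuple((a * x + b * y) % 2 for x, y in zip(u, v))
                     for a in range(2) for b in range(2))

# Over F_2, two nonzero vectors are independent iff they are distinct.
planes = {span(u, v) for u in vectors for v in vectors if v != u}
print(len(planes))  # 35, versus 31 points of P^4 over F_2
```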
- Borcherds