§ Projective spaces and Grassmannians in AG
§ Projective space
- Projective space $\mathbb{P}^n$ is the space of all lines through the origin in $k^{n+1}$.
- Algebraically constructed as $\mathbb{P}^n := (k^{n+1} \setminus \{0\}) / k^\times$.
- We exclude the origin to remove "degenerate lines", since the subspace spanned by $0$ when acted on with $k^\times$ is just $\{0\}$, which is zero dimensional.
- $G(1, n+1) = \mathbb{P}^n$: $1$-dimensional subspaces of $k^{n+1}$.
- $G(k, n)$: $k$-dimensional subspaces of $k^n$.
- $G(2, 3)$ is the space of planes through the origin in $k^3$. Can projectivize equations by homogenizing: send $ax + by + c = 0$ to $ax + by + cz = 0$.
- Duality: $G(k, n) \cong G(n-k, n)$. We map the subspace $S \subseteq V$ to the annihilator of $S$ in $V^*$: That is, we map $S$ to the set of all linear functionals that vanish on $S$ [ie, whose kernel contains $S$].
- The above implies $G(n-1, n) \cong G(1, n)$. $G(1, n)$ is just projective space $\mathbb{P}^{n-1}$. $(n-1)$-dimensional subspaces (hyperplanes) are cut out by single linear equations, $a_1 x_1 + \dots + a_n x_n = 0$.
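As a sanity check on the duality, here is a minimal brute-force sketch over $\mathbb{F}_2$ (the helper names `span_f2` and `annihilator` are mine, chosen for illustration): a $k$-dimensional subspace $S$ of $\mathbb{F}_2^n$ has an annihilator of dimension $n - k$, so $|S| = 2^k$ and $|\mathrm{Ann}(S)| = 2^{n-k}$.

```python
from itertools import product

def dot_f2(u, v):
    # Pairing between a functional (written as a row vector) and a vector, over F_2
    return sum(a * b for a, b in zip(u, v)) % 2

def span_f2(gens, n):
    # All F_2-linear combinations of the generators
    return {tuple(sum(c * g[i] for c, g in zip(coeffs, gens)) % 2 for i in range(n))
            for coeffs in product([0, 1], repeat=len(gens))}

def annihilator(S, n):
    # All functionals vanishing on every vector of S
    return {w for w in product([0, 1], repeat=n)
            if all(dot_f2(w, v) == 0 for v in S)}

n, k = 4, 2
S = span_f2([(1, 0, 1, 0), (0, 1, 1, 1)], n)   # a k-dim subspace of F_2^n
Ann = annihilator(S, n)
# dim S = k and dim Ann = n - k, so |S| = 2^k = 4 and |Ann| = 2^(n-k) = 4
print(len(S), len(Ann))   # -> 4 4
```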
§ G(2, 4)
- These are lines in $\mathbb{P}^3$. This will give us pairs of points of the form $[p_0 : p_1 : p_2 : p_3]$ and $[q_0 : q_1 : q_2 : q_3]$. That is, we're considering "lines" between "points" (or "vectors") in $\mathbb{P}^3$. Exactly what we need to solve stabbing line problems for computer graphics :)
- Start by taking a 2D plane. The line will pass through a point in the 2D plane. This gives us two degrees of freedom.
- Then take a direction in ordinary Euclidean space $\mathbb{R}^3$ (or the projective plane $\mathbb{P}^2$ of directions, to be precise). This gives us two degrees of freedom.
- Can also be said to be a 2-dim. subspace of a 4-dim. vector space.
- In total, $G(2, 4)$ should therefore have four degrees of freedom.
- Take $W \subseteq V$ where $\dim V = 4$, and $W$ is a 2-dimensional subspace.
- $W$ is spanned by two vectors $v_1, v_2$. So I can record it as a $2 \times 4$ matrix: $M = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}$. Vector $v_i$ has coordinates $(v_{i1}, v_{i2}, v_{i3}, v_{i4})$.
- If I had taken another basis $(w_1, w_2)$, there would be an invertible $2 \times 2$ matrix $A$ (an element of $GL_2(k)$) that sends $(v_1, v_2)$ to $(w_1, w_2)$. Vice versa, any invertible matrix gives us a new basis.
- So the redundancy in our choice of parametrization of subspaces (via basis vectors) is captured entirely by the group $GL_2(k)$.
- Key idea: compute the maximal minors of the matrix $M$.
- This is going to be a vector of $\binom{4}{2} = 6$ numbers.
- Note here that we are computing minors of a rectangular matrix, where we take all possible $2 \times 2$ submatrices and calculate their determinants.
- In this case, we must pick both rows, and we have $\binom{4}{2} = 6$ choices of columns; thus we live in $k^6$.
- We represent this as the map $\wedge : \mathrm{Mat}_{2 \times 4}(k) \to k^6$ which sends a matrix to its vector of minors.
- The great advantage of this is that we have $\wedge(AM) = \det(A) \, \wedge(M)$, since the minor by definition takes a determinant of submatrices, and the determinant is multiplicative.
- Thus, we have converted a matrix redundancy of $GL_2$ in $\mathrm{Mat}_{2 \times 4}$ into a scalar redundancy (of $\det A \in k^\times$) in $k^6$.
- We know how to handle scalar redundancies: live in projective space!
- Therefore, we have a well defined map $G(2, 4) \to \mathbb{P}^5$. Given a subspace $W$, compute a basis $(v_1, v_2)$ for $W$, then compute the minors of the matrix $M = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}$, and send this to $[\wedge(M)] \in \mathbb{P}^5$.
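The construction above can be sketched in a few lines (the helper names `minors_2x4` and `mat_mul` are mine, not standard): compute the six maximal minors of a $2 \times 4$ matrix, and check that a change of basis $M \mapsto AM$ only rescales the whole minor vector by $\det A$.

```python
from itertools import combinations

def minors_2x4(M):
    # The six maximal (2x2) minors of a 2x4 matrix, columns in lex order:
    # (0,1), (0,2), (0,3), (1,2), (1,3), (2,3)
    return tuple(M[0][i] * M[1][j] - M[0][j] * M[1][i]
                 for i, j in combinations(range(4), 2))

def mat_mul(A, M):
    # (2x2) times (2x4)
    return [[sum(A[r][k] * M[k][c] for k in range(2)) for c in range(4)]
            for r in range(2)]

M = [[1, 2, 3, 4],
     [0, 1, 1, 2]]                 # rows span a 2-dim subspace W of k^4
A = [[3, 1],
     [1, 1]]                       # a change of basis, det A = 2

detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
lhs = minors_2x4(mat_mul(A, M))
rhs = tuple(detA * m for m in minors_2x4(M))
assert lhs == rhs                  # wedge(A M) = det(A) * wedge(M)
print(minors_2x4(M))               # -> (1, 1, 2, -1, 0, 2)
```

So the two bases give the same point of $\mathbb{P}^5$ after projectivizing.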
§ $G(2, 4)$, projectively
- These are lines in $\mathbb{P}^3$.
- So take two points in $\mathbb{P}^3$, call these $p = [p_0 : p_1 : p_2 : p_3]$ and $q = [q_0 : q_1 : q_2 : q_3]$. Again, this gives us a matrix: $\begin{pmatrix} p_0 & p_1 & p_2 & p_3 \\ q_0 & q_1 & q_2 & q_3 \end{pmatrix}$
- We define $p_{ij} := p_i q_j - p_j q_i$, which is the minor with columns $(i, j)$.
- Then we compress the above matrix as the tuple $(p_{01}, p_{02}, p_{03}, p_{12}, p_{13}, p_{23})$. See that $p_{ij} = -p_{ji}$ and $p_{ii} = 0$. So we choose only as many $p_{ij}$s as are "useful": those with $i < j$.
- See that if we change $p$ or $q$ by a (nonzero) constant, then all the $p_{ij}$ change by the same constant, and thus the point itself in $\mathbb{P}^5$ does not change.
- We can also change by adding some scaled version of . This is like adding a multiple of the second row to the first row when taking determinants. But this does not change determinants!
- Thus, the actual Plücker coordinates are invariant under which two points we choose to parametrize the line in $\mathbb{P}^3$.
- This gives us a well defined map from lines in $\mathbb{P}^3$ to points in $\mathbb{P}^5$.
- This is not an onto map; lines in $\mathbb{P}^3$ have dimension 4 (need four coefficients, as counted above), while $\mathbb{P}^5$ has dimension 5.
- So heuristically, we are missing "one equation" to cut $\mathbb{P}^5$ with, to get the image of the lines in $\mathbb{P}^3$.
- This is the famous Plücker relation: $p_{02}\, p_{13} = p_{01}\, p_{23} + p_{03}\, p_{12}$ (equivalently, $p_{01} p_{23} - p_{02} p_{13} + p_{03} p_{12} = 0$).
- It suffices to prove the relation for the "standard matrix" $\begin{pmatrix} 1 & 0 & a & b \\ 0 & 1 & c & d \end{pmatrix}$, since row reduction $M \mapsto AM$ rescales every $p_{ij}$ by $\det A$, which rescales both sides of the relation equally.
- In this case, we get $c \cdot (-b) = 1 \cdot (ad - bc) + d \cdot (-a)$, i.e. $-bc = -bc$, which holds.
- In general, we get quadratic Plücker relations among the minors; for $G(2, n)$ they read $p_{ij}\, p_{kl} - p_{ik}\, p_{jl} + p_{il}\, p_{jk} = 0$ for $i < j < k < l$.
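A quick numerical check of the relation (the helper name `plucker` is mine): for random pairs of points spanning a line, the six minors always satisfy $p_{01} p_{23} - p_{02} p_{13} + p_{03} p_{12} = 0$.

```python
import random
from itertools import combinations

def plucker(p, q):
    # p_ij = p_i q_j - p_j q_i: the 2x2 minor at columns (i, j) of the
    # matrix with rows p and q
    return {(i, j): p[i] * q[j] - p[j] * q[i]
            for i, j in combinations(range(4), 2)}

random.seed(0)
for _ in range(100):
    p = [random.randint(-9, 9) for _ in range(4)]
    q = [random.randint(-9, 9) for _ in range(4)]
    c = plucker(p, q)
    # p01*p23 - p02*p13 + p03*p12 = 0 for every pair of points
    assert c[(0, 1)] * c[(2, 3)] - c[(0, 2)] * c[(1, 3)] + c[(0, 3)] * c[(1, 2)] == 0
print("Plücker relation holds for 100 random point pairs")
```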
§ Observations of the embedding $G(2, 4) \hookrightarrow \mathbb{P}^5$
- Suppose $M$ has non-vanishing first minor (the $2 \times 2$ block of the first two columns). Let $A$ be the inverse of this first block, so $A \in GL_2$. Set $M' := AM$.
- Then $M' = \begin{pmatrix} 1 & 0 & a & b \\ 0 & 1 & c & d \end{pmatrix}$.
- So the first block is the identity. Further, the $(a, b, c, d)$ are unique.
- As we vary $(a, b, c, d) \in k^4$, we get different 2 dimensional subspaces in $k^4$. Thus, locally, the Grassmannian looks like $\mathbb{A}^4$. This gives us an affine chart!
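The chart can be computed explicitly; here is a sketch (the helper name `chart_coords` is mine), assuming the first $2 \times 2$ block of $M$ is invertible:

```python
from fractions import Fraction

def chart_coords(M):
    # Multiply M on the left by the inverse A of its first 2x2 block,
    # so that A*M = [[1, 0, a, b], [0, 1, c, d]]; return (a, b, c, d).
    det = Fraction(M[0][0] * M[1][1] - M[0][1] * M[1][0])   # assumed nonzero
    A = [[ M[1][1] / det, -M[0][1] / det],
         [-M[1][0] / det,  M[0][0] / det]]
    N = [[sum(A[i][k] * M[k][j] for k in range(2)) for j in range(4)]
         for i in range(2)]
    assert N[0][:2] == [1, 0] and N[1][:2] == [0, 1]   # first block is the identity
    return (N[0][2], N[0][3], N[1][2], N[1][3])

M = [[1, 2, 3, 4],
     [0, 1, 1, 2]]
print(chart_coords(M))   # the unique (a, b, c, d) for this subspace
```

Different bases of the same subspace give the same $(a, b, c, d)$, which is exactly the uniqueness claim above.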
- We can recover the Grassmannian from the embedding. Let $p_{01}, p_{02}, p_{03}, p_{12}, p_{13}, p_{23}$ be the coordinate functions on $\mathbb{P}^5$ ($p$ for Plücker).
- The equation $p_{01} p_{23} - p_{02} p_{13} + p_{03} p_{12} = 0$ vanishes on the Grassmannian. We can show that the zero set of this equation is exactly the Grassmannian.
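Here is a sketch of the converse direction on the chart $p_{01} \neq 0$ (helper names `minors_2x4` and `line_from_plucker` are mine): any point of $\mathbb{P}^5$ satisfying the Plücker relation arises as the minor vector of a standard matrix, whose entries can be read off directly from the coordinates.

```python
from fractions import Fraction
from itertools import combinations

def minors_2x4(M):
    # Minors in lex column order (0,1), (0,2), (0,3), (1,2), (1,3), (2,3)
    return tuple(M[0][i] * M[1][j] - M[0][j] * M[1][i]
                 for i, j in combinations(range(4), 2))

def line_from_plucker(p01, p02, p03, p12, p13, p23):
    # On the chart p01 != 0: rescale so p01 = 1; then the standard matrix
    # [[1, 0, a, b], [0, 1, c, d]] has a = -p12, b = -p13, c = p02, d = p03.
    s = Fraction(1) / p01
    p02, p03, p12, p13, p23 = (s * x for x in (p02, p03, p12, p13, p23))
    # The Plücker relation is exactly what forces p23 = ad - bc:
    assert p23 == p02 * p13 - p03 * p12
    return [[1, 0, -p12, -p13],
            [0, 1,  p02,  p03]]

pt = (1, 1, 2, -1, 0, 2)       # satisfies 1*2 - 1*0 + 2*(-1) = 0
M = line_from_plucker(*pt)
assert minors_2x4(M) == pt     # the minors recover the point exactly
print(M)
```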
§ Computing cohomology of $G(2, 4)$
- Take all points of the following form: $\begin{pmatrix} 1 & 0 & * & * \\ 0 & 1 & * & * \end{pmatrix}$
- Let's look at the first column: it is $\binom{1}{0}$. Why not $\binom{1}{*}$? Well, I can always cancel the $*$ in the second row by subtracting a scaled version of the first row! (this doesn't change the determinants). Thus, if we have a $1$ somewhere, the "complement" (the rest of its column) must be a $0$.
- Next, we can have something like: $\begin{pmatrix} 1 & * & 0 & * \\ 0 & 0 & 1 & * \end{pmatrix}$
- Here, at the second column of the second row: if we didn't have a $0$, then we could have rescaled it to $1$ and standardized, putting it into the form of the first case! Thus, we must have a $0$ to get a case different from the previous.
- If we count the number of $*$s, which is the number of degrees of freedom, we see that one of them (the last one) has zero stars ($\begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$), one of them has 1 star ($\begin{pmatrix} 0 & 1 & * & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$), two of them have 2 stars ($\begin{pmatrix} 1 & * & * & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$ and $\begin{pmatrix} 0 & 1 & 0 & * \\ 0 & 0 & 1 & * \end{pmatrix}$), one of them has 3 stars, and one of them has 4 stars.
- This lets us read off the cohomology of the Grassmannian: we know the cellular decomposition, i.e., we know the number of cells in each dimension. (Over $\mathbb{C}$, a cell with $s$ stars is a copy of $\mathbb{C}^s$, giving Betti numbers $1, 0, 1, 0, 2, 0, 1, 0, 1$ in degrees $0$ through $8$.)
- Alternatively, we can see that over a finite field $\mathbb{F}_q$, we have $1 + q + 2q^2 + q^3 + q^4$ points (each cell with $s$ stars contributes $q^s$). On the other hand, $\mathbb{P}^4$ has $1 + q + q^2 + q^3 + q^4$ points. Thus the Grassmannian is different from projective space!
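The point count over $\mathbb{F}_2$ can be verified by brute force (a minimal sketch; the helper name `span_f2` is mine): enumerate all spans of pairs of nonzero vectors and keep the ones of dimension 2.

```python
from itertools import product

def span_f2(u, v):
    # The subspace of F_2^4 spanned by u and v, as a frozenset of tuples
    return frozenset(tuple((a * x + b * y) % 2 for x, y in zip(u, v))
                     for a, b in product([0, 1], repeat=2))

nonzero = [v for v in product([0, 1], repeat=4) if any(v)]
spans = (span_f2(u, v) for u in nonzero for v in nonzero)
# A 2-dim subspace over F_2 has exactly 2^2 = 4 elements, which filters
# out the linearly dependent pairs (u, v).
subspaces = {S for S in spans if len(S) == 4}

q = 2
print(len(subspaces), 1 + q + 2 * q**2 + q**3 + q**4)   # -> 35 35
```

Both counts agree at $35$, while $\mathbb{P}^4(\mathbb{F}_2)$ has only $31$ points.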