Let $A$, $B$ be square $n\times n$ matrices over an infinite field (we identify them with linear operators on the vector space of columns). It is given that for all scalars $a,b$ the matrix $aA+bB$ is singular. Does it follow that there exist matrices $P$, $Q$ such that $\operatorname{rank}(P)+\operatorname{rank}(Q) > n$ but $PAQ=PBQ=0$?

If yes, is the same true for arbitrary subspaces of singular matrices? Well, the answer is no for $3\times 3$ antisymmetric matrices... But how can subspaces of singular matrices be described (if they can be)?

  • This definitely holds if $A$ and $B$ commute. I don't think it holds generally, but I can't produce a counterexample off the top of my head. – darij grinberg Nov 25 '10 at 19:18
  • Definitely not for arbitrary subspaces: try the subspace of $3\times 3$ matrices of the form $\left(\begin{array}{ccc} a&0&0 \\ b&0&0 \\ c&d&e \end{array}\right)$. – darij grinberg Nov 25 '10 at 19:21
  • Ok, for two matrices it also doesn't work: $\left(\begin{array}{ccc}1&0&0 \\ 0&0&0 \\ 1&1&0 \end{array}\right)$ and $\left(\begin{array}{ccc}0&0&0 \\ 1&0&0 \\ 0&0&1 \end{array}\right)$. No warranty. – darij grinberg Nov 25 '10 at 19:24
  • oh, thanks, indeed! I need to think a bit to understand what I really wanted to ask instead:) – Fedor Petrov Nov 25 '10 at 19:39
  • Fedor: Since your problem is invariant under simultaneous multiplication of $A$ and $B$ by invertible matrices from the left and the right, it is a problem about representations of the tame $1$-Kronecker quiver $\tilde{A}_1$. See cs-linux.ubishops.ca/~bruestle/Publications/… for this (it's one of the so-called tame quivers). – darij grinberg Nov 26 '10 at 0:25
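The pencil in darij grinberg's second comment can be spot-checked mechanically. The following script is an editorial sketch (not part of the original thread); `det3` is a hand-rolled exact $3\times 3$ determinant.

```python
from fractions import Fraction

def det3(m):
    """Exact 3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# The two matrices from the comment above.
A = [[1, 0, 0], [0, 0, 0], [1, 1, 0]]
B = [[0, 0, 0], [1, 0, 0], [0, 0, 1]]

# det(aA + bB) is a homogeneous cubic in (a, b); a nonzero one vanishes on
# at most 3 lines through the origin, so vanishing on this whole grid
# forces it to vanish identically.
for i in range(-3, 4):
    for j in range(-3, 4):
        a, b = Fraction(i), Fraction(j)
        M = [[a * A[r][c] + b * B[r][c] for c in range(3)] for r in range(3)]
        assert det3(M) == 0
print("det(aA + bB) = 0 for every (a, b)")
```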

Since the question in the new formulation is quite different, I am adding a new answer. Now the answer is positive, but the proof is not so simple; I will sketch the main steps.

First of all, assume $A$ and $B$ are matrices of size $n$. Let $V$ and $W$ be $n$-dimensional vector spaces, so that $A,B \in \operatorname{Hom}(V,W)$. Consider $P^1$ with homogeneous coordinates $(x:y)$ and the morphism $V\otimes O(-1) \to W\otimes O$ given by $xA + yB$. Let $K$ be its kernel and $C$ its cokernel, so that there is an exact sequence
$$ 0 \to K \to V\otimes O(-1) \to W\otimes O \to C \to 0. $$
Writing $r$ and $d$ for the rank and degree of a sheaf, the singularity condition implies $r(K) = r(C) > 0$, and from the exact sequence it follows that $d(K) = d(C) - n$.

Now take $Q$ to be the induced map
$$ H^1(P^1,K(-1)) \to H^1(P^1,V\otimes O(-2)) = V $$
and $P$ to be the induced map
$$ W = H^0(P^1,W\otimes O) \to H^0(P^1,C). $$
One can check that $Q$ is an embedding, $P$ is a surjection, and $PAQ = PBQ = 0$, so it remains to check that $\dim H^1(P^1,K(-1)) + \dim H^0(P^1,C) > n$. This can be done as follows. First, by Riemann–Roch on $P^1$,
$$ \dim H^0(P^1,C) \ge \chi(C) = r(C) + d(C). $$
Further,
$$ \dim H^1(P^1,K(-1)) \ge -\chi(K(-1)) = -(r(K) + d(K) - r(K)) = -d(K) = n - d(C). $$
Summing up,
$$ \dim H^1(P^1,K(-1)) + \dim H^0(P^1,C) \ge r(C) + d(C) + n - d(C) = n + r(C) > n. $$
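For a concrete illustration of the conclusion (though not of the cohomological machinery), take the pencil from darij grinberg's comment under the question; a suitable pair $(P,Q)$ can be found by inspection. The sketch below is an editorial addition that only verifies $PAQ=PBQ=0$ with $\operatorname{rank}P+\operatorname{rank}Q=4>3$:

```python
def matmul(X, Y):
    """Naive matrix product for small integer matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 0, 0], [0, 0, 0], [1, 1, 0]]
B = [[0, 0, 0], [1, 0, 0], [0, 0, 1]]

# P projects onto the first two coordinates (rank 2);
# Q embeds span(e2, e3) into k^3 (rank 2); so rank P + rank Q = 4 > 3.
P = [[1, 0, 0], [0, 1, 0]]
Q = [[0, 0], [1, 0], [0, 1]]

assert matmul(P, matmul(A, Q)) == [[0, 0], [0, 0]]
assert matmul(P, matmul(B, Q)) == [[0, 0], [0, 0]]
print("PAQ = PBQ = 0 with rank(P) + rank(Q) = 4 > 3")
```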

  • Wow. The question can be explained to a student with a semester of linear algebra; is there no solution at a similar level of sophistication? – Gerry Myerson Nov 25 '10 at 21:48
  • Certainly one can explain the answer in other languages, but I believe this is the shortest way to explain it. – Sasha Nov 26 '10 at 7:22
  • Do you have an answer for the more general question "how can subspaces of singular matrices be described (if they can)"? For subspaces of dimension $>2$ the given criterion is not necessary; e.g. consider the space $\left\{\begin{bmatrix}a&0&b\\0&a&c\\-c&b&0\end{bmatrix}\mid a,b,c\in k\right\}$ – stewbasic Sep 18 '17 at 23:29
  • @stewbasic: Geometrically, that is the question about linear spaces on the discriminant variety $\mathfrak{D} \subset \mathbb{P}^{n^2-1}$ of degenerate maps. I guess any such space lies inside a maximal one, but I am not sure that the classification of maximal subspaces is known for all $n$. Definitely, there are subspaces $L_{P,Q}$ as above, but for odd $n$ there are also subspaces of skew-symmetric matrices up to a change of basis in one of the spaces (I guess your example is equivalent to this). Maybe there are other maximal subspaces as well. – Sasha Sep 19 '17 at 6:45
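The three-parameter family in stewbasic's comment can be checked by direct expansion: the only two nonzero terms of the determinant, $-abc$ and $+abc$, cancel. A small exact-arithmetic check (editorial sketch):

```python
from itertools import product

def det3(m):
    """Exact 3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# det [[a,0,b],[0,a,c],[-c,b,0]] = a*(-cb) + b*(ac) = 0 for all a, b, c;
# the degree-3 polynomial identity is confirmed on an integer grid.
for a, b, c in product(range(-2, 3), repeat=3):
    M = [[a, 0, b], [0, a, c], [-c, b, 0]]
    assert det3(M) == 0
print("every matrix in the family is singular")
```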

No. For example, take $A$ and $B$ skew-symmetric with $n$ odd. Then all linear combinations of $A$ and $B$ are skew-symmetric, hence degenerate. But for a generic choice of $A$ and $B$ they have no common kernel or cokernel vector. An explicit example is $$ A = \left(\begin{smallmatrix}0 & 1 & 0\cr -1 & 0 & 0\cr 0 & 0 & 0\end{smallmatrix}\right), \qquad B = \left(\begin{smallmatrix}0 & 0 & 0\cr 0 & 0 & 1\cr 0 & -1 & 0\end{smallmatrix}\right). $$
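A minimal check of this example (editorial addition): every combination $aA+bB$ is skew-symmetric of odd size, hence singular, while $\ker A$ and $\ker B$ are different lines:

```python
def det3(m):
    """Exact 3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def apply(M, v):
    """Matrix-vector product."""
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

A = [[0, 1, 0], [-1, 0, 0], [0, 0, 0]]
B = [[0, 0, 0], [0, 0, 1], [0, -1, 0]]

# Every aA + bB = [[0,a,0],[-a,0,b],[0,-b,0]] is odd-size skew-symmetric,
# hence singular.
for a in range(-3, 4):
    for b in range(-3, 4):
        assert det3([[0, a, 0], [-a, 0, b], [0, -b, 0]]) == 0

# A and B have rank 2, and their one-dimensional kernels are the distinct
# lines spanned by e3 and e1 respectively (same for cokernels, by skewness).
assert apply(A, [0, 0, 1]) == [0, 0, 0]
assert apply(B, [1, 0, 0]) == [0, 0, 0]
assert apply(A, [1, 0, 0]) != [0, 0, 0]
assert apply(B, [0, 0, 1]) != [0, 0, 0]
print("all combinations singular; ker A and ker B are different lines")
```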

  • Sasha, it is a little more subtle: you can take $P$ to kill $A$ on the left and $Q$ to kill $B$ on the right. Then you have $PAQ=PBQ=0_3$. However, ${\rm rk}(P)+{\rm rk}(Q)=2<n=3$. – Denis Serre Nov 25 '10 at 20:16
  • That was the answer to the previous version of the question, which did not include $P$ and $Q$. – Sasha Nov 25 '10 at 20:21
  • Yes, thanks, Sasha, your (and Darij's) answer is correct, so I edited the question. – Fedor Petrov Nov 25 '10 at 20:23

Here is a non-answer to the more general question. All the examples noted in the question are generalized by the following construction. For each decomposition $n=p+q+r$ with $q$ odd, matrices of the following form are singular: $$ \begin{bmatrix} *&0&0\\ *&A&0\\ *&*&*\\ \end{bmatrix} $$ where the diagonal blocks are square of sizes $p,q,r$ and $A$ is antisymmetric (in characteristic $2$ we require $v^tAv=0$). We can describe the construction in a basis-independent way. Suppose $U$ is a subspace of $V^*\oplus V$ with $\dim U=\dim V$ and $\dim\pi_1(U)+\dim\pi_2(U)+\dim V$ odd, where $\pi_1,\pi_2$ are the projections of $V^*\oplus V$ onto its two summands. Then $$ \{X\in\mathrm{End}(V)\mid\lambda Xu=0\text{ for all }(\lambda,u)\in U\} $$ is a space of singular matrices.
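The singularity of this block shape is just block triangularity: $\det = \det(\text{top-left }p\times p)\cdot\det(A)\cdot\det(\text{bottom-right }r\times r)$, and an odd-size antisymmetric $A$ has $\det A=0$. A randomized exact check for $n=5$, $(p,q,r)=(1,3,1)$ (editorial sketch):

```python
import random
from fractions import Fraction

def det(m):
    """Exact determinant via Gaussian elimination over the rationals."""
    n = len(m)
    m = [[Fraction(x) for x in row] for row in m]
    sign = 1
    for col in range(n):
        piv = next((r for r in range(col, n) if m[r][col] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != col:
            m[col], m[piv] = m[piv], m[col]
            sign = -sign
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= f * m[col][c]
    out = Fraction(sign)
    for i in range(n):
        out *= m[i][i]
    return out

random.seed(0)
rnd = lambda: random.randint(-5, 5)

for _ in range(20):
    a, b, c = rnd(), rnd(), rnd()
    A = [[0, a, b], [-a, 0, c], [-b, -c, 0]]          # antisymmetric, q = 3
    M = ([[rnd(), 0, 0, 0, 0]]                         # [* 0 0], p = 1
         + [[rnd()] + A[i] + [0] for i in range(3)]    # [* A 0]
         + [[rnd() for _ in range(5)]])                # [* * *], r = 1
    assert det(M) == 0
print("20 random matrices of the block shape, all singular")
```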

The second form seems promising because of the following result. For any space $L$ of singular matrices over an infinite field and any $X\in L$ of rank $n-1$, we have $\lambda Lu=0$, where $\lambda$ and $u$ span the kernels of $X^*$ and $X$ respectively (to see this, note that $\mathrm{adj}(X)\propto u\lambda$ and that $\mathrm{tr}(\mathrm{adj}(X)Y)$ is the coefficient of $yx^{n-1}$ in $\det(xX+yY)$). However, the construction still isn't exhaustive. Indeed, any $L$ produced by the above construction further satisfies $$ \mathrm{tr}(\mathrm{adj}(X)Y\mathrm{adj}(Z)W)+ \mathrm{tr}(\mathrm{adj}(X)W\mathrm{adj}(Z)Y)=0 $$ for $X,Y,Z,W\in L$. But the following four matrices fail this identity and span a space of singular matrices: $$ X=\begin{bmatrix} 1&0&0&0&0\\ 0&0&0&0&0\\ 0&0&0&1&0\\ 0&0&-1&0&0\\ 0&0&0&0&1\end{bmatrix},\, Y=\begin{bmatrix} 1&0&0&0&0\\ 0&0&0&1&0\\ 0&0&0&0&0\\ 0&-1&0&0&0\\ 0&0&0&0&1\end{bmatrix},\, Z=\begin{bmatrix} 1&0&0&0&0\\ 0&0&1&0&0\\ 1&-1&0&0&0\\ 0&0&0&0&0\\ 0&1&0&1&0\end{bmatrix},\, W=\begin{bmatrix} 0&1&0&1&-1\\ 1&0&1&0&1\\ -1&-1&0&0&-2\\ 0&1&0&1&0\\ 0&0&0&0&0\end{bmatrix}. $$
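These claims can be checked by machine. The sketch below (an editorial addition; exact rational arithmetic, adjugate via cofactors) confirms that $X,Y,Z,W$ and their sum are singular, and evaluates the left-hand side of the trace identity, which according to this answer should come out nonzero:

```python
from fractions import Fraction

def det(m):
    """Exact determinant via Gaussian elimination over the rationals."""
    n = len(m)
    m = [[Fraction(x) for x in row] for row in m]
    sign = 1
    for col in range(n):
        piv = next((r for r in range(col, n) if m[r][col] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != col:
            m[col], m[piv] = m[piv], m[col]
            sign = -sign
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= f * m[col][c]
    out = Fraction(sign)
    for i in range(n):
        out *= m[i][i]
    return out

def minor(m, i, j):
    """m with row i and column j removed."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(m) if k != i]

def adj(m):
    """Adjugate: adj(m)[i][j] = (-1)^(i+j) * det(m minus row j, column i)."""
    n = len(m)
    return [[(-1) ** (i + j) * det(minor(m, j, i)) for j in range(n)]
            for i in range(n)]

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def tr(m):
    return sum(m[i][i] for i in range(len(m)))

X = [[1,0,0,0,0],[0,0,0,0,0],[0,0,0,1,0],[0,0,-1,0,0],[0,0,0,0,1]]
Y = [[1,0,0,0,0],[0,0,0,1,0],[0,0,0,0,0],[0,-1,0,0,0],[0,0,0,0,1]]
Z = [[1,0,0,0,0],[0,0,1,0,0],[1,-1,0,0,0],[0,0,0,0,0],[0,1,0,1,0]]
W = [[0,1,0,1,-1],[1,0,1,0,1],[-1,-1,0,0,-2],[0,1,0,1,0],[0,0,0,0,0]]

for M in (X, Y, Z, W):
    assert det(M) == 0
S = [[X[i][j] + Y[i][j] + Z[i][j] + W[i][j] for j in range(5)] for i in range(5)]
assert det(S) == 0  # a sample combination from the span is singular too

lhs = (tr(mul(mul(adj(X), Y), mul(adj(Z), W)))
     + tr(mul(mul(adj(X), W), mul(adj(Z), Y))))
print("trace identity LHS =", lhs)  # claimed to be nonzero in the answer above
```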
