The following lemma is a curious result that I’ve now found useful both in Lie algebras (in the classification of root systems) and coding theory (to prove the Plotkin bounds).

**Lemma.** Suppose that $v_1, \ldots, v_n$ are non-zero vectors in a real inner product space such that $\langle v_i, v_j \rangle \le 0$ for all distinct $i$ and $j$. Then

- if $\lambda_1 v_1 + \cdots + \lambda_n v_n = 0$ where the minimal possible number of the $\lambda_i$ are non-zero, then all the non-zero $\lambda_i$ have the same sign; and
- $v_1, \ldots, v_n$ are linearly independent if and only if there exists a vector $u$ such that $\langle u, v_i \rangle > 0$ for all $i$.
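As a quick numerical sanity check on both parts (the example vectors here are my own choice, not part of the lemma): three unit vectors at $120^\circ$ angles in the plane have pairwise inner product $-1/2 \le 0$, they sum to zero — a dependency in which every coefficient is $+1$, so all of the same sign — and, being dependent, no $u$ can have positive inner product with all three, since $\langle u, v_1 + v_2 + v_3 \rangle = 0$.

```python
# Sanity check: three unit vectors at 120-degree angles in the plane.
# Pairwise inner products are -1/2 <= 0, and the vectors sum to zero,
# giving a dependency whose coefficients (all equal to 1) share a sign.
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

vs = [(math.cos(2 * math.pi * k / 3), math.sin(2 * math.pi * k / 3))
      for k in range(3)]

# all pairwise inner products equal -1/2 (up to floating-point error)
pairwise = [dot(vs[i], vs[j]) for i in range(3) for j in range(3) if i != j]
print(all(abs(p + 0.5) < 1e-12 for p in pairwise))  # True

# the vectors sum to (0, 0): a dependency with all coefficients +1
total = tuple(sum(v[c] for v in vs) for c in range(2))
print(all(abs(t) < 1e-12 for t in total))  # True
```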

**Proof.** Suppose that there is a non-trivial linear dependency between the $v_i$. Separating the coefficients according to their sign, we get a relation

$$\sum_{i \in A} a_i v_i = \sum_{j \in B} b_j v_j$$

where $A$ and $B$ are disjoint subsets of $\{1, \ldots, n\}$ and the $a_i$ and $b_j$ are strictly positive. Let $w$ be the common value of both sides of the above equation. Then

$$\langle w, w \rangle = \Bigl\langle \sum_{i \in A} a_i v_i, \sum_{j \in B} b_j v_j \Bigr\rangle = \sum_{i \in A} \sum_{j \in B} a_i b_j \langle v_i, v_j \rangle \le 0,$$

so $w = 0$. Hence $\sum_{i \in A} a_i v_i = 0$ and $\sum_{j \in B} b_j v_j = 0$; if both $A$ and $B$ were non-empty, each of these would be a non-trivial dependency with fewer non-zero coefficients than the original, so in a dependency with the minimal number of non-zero coefficients, one of $A$ and $B$ is empty. This proves the first part of the lemma.
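The sign-splitting step can be seen on a small example of my own (not from the post): the four vectors $\pm e_1, \pm e_2$ in the plane have pairwise inner products $\le 0$, and $v_1 + v_2 - v_3 - v_4 = 0$ is a mixed-sign dependency. Splitting by sign gives $w = v_1 + v_2 = v_3 + v_4$; as the proof predicts, $w = 0$, so each half is itself a smaller dependency and the mixed-sign dependency was not minimal.

```python
# Illustration of the sign-splitting step with v1..v4 = +/-e1, +/-e2.
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

v1, v2, v3, v4 = (1, 0), (-1, 0), (0, 1), (0, -1)

# pairwise inner products are all <= 0, as the lemma requires
vecs = [v1, v2, v3, v4]
assert all(dot(vecs[i], vecs[j]) <= 0
           for i in range(4) for j in range(4) if i != j)

# mixed-sign dependency v1 + v2 - v3 - v4 = 0, split by coefficient sign
w_left = (v1[0] + v2[0], v1[1] + v2[1])    # sum over positive coefficients
w_right = (v3[0] + v4[0], v3[1] + v4[1])   # sum over negative coefficients
print(w_left, w_right)  # both (0, 0): w = 0, so each half is a dependency
```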

Moreover, if a vector $u$ exists satisfying the hypothesis of the second part, then in any non-trivial dependency as above (swapping $A$ and $B$ if necessary, so that $A$ is non-empty) we would have

$$0 = \langle u, w \rangle = \sum_{i \in A} a_i \langle u, v_i \rangle > 0,$$

a contradiction, which proves the ‘if’ direction of the second part.

For the ‘only if’ direction we need to find a $u$ making a small angle with all of the $v_i$. If we try to construct such a $u$ as a linear combination of the $v_i$, we quickly lose all control over the various inner products. What works much better is to work with vectors known to be perpendicular to most of the $v_i$. Let $V$ be the subspace spanned by the $v_i$, and choose, for each $i$, a non-zero vector $w_i \in V$ perpendicular to all the $v_j$ for $j \ne i$. (This is possible because, by linear independence, the span of these $n-1$ vectors has codimension at least one in $V$.) Note that $\langle w_i, v_i \rangle \ne 0$, since otherwise $w_i \in V \cap V^\perp = \{0\}$. Therefore, if $u = c_1 w_1 + \cdots + c_n w_n$ we have

$$\langle u, v_i \rangle = c_i \langle w_i, v_i \rangle$$

and by taking each $c_i$ to have the same sign as $\langle w_i, v_i \rangle$ we get a suitable $u$.
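Here is a concrete run of this construction, with example vectors of my own choosing: the simple roots of $B_2$, $v_1 = (1, -1)$ and $v_2 = (0, 1)$, which satisfy $\langle v_1, v_2 \rangle = -1 \le 0$ and are linearly independent. The perpendicular vectors $w_i$ are found by hand here; in general one solves a small linear system for each.

```python
# Constructing u for v1 = (1, -1), v2 = (0, 1) (simple roots of B2).
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

v1, v2 = (1, -1), (0, 1)
assert dot(v1, v2) <= 0  # hypothesis of the lemma

# w1 is perpendicular to v2 and w2 is perpendicular to v1
# (found by hand for this 2-dimensional example)
w1, w2 = (1, 0), (1, 1)
assert dot(w1, v2) == 0 and dot(w2, v1) == 0

# <w_i, v_i> != 0, so scale each w_i by the sign of <w_i, v_i>
c1 = 1 if dot(w1, v1) > 0 else -1
c2 = 1 if dot(w2, v2) > 0 else -1
u = (c1 * w1[0] + c2 * w2[0], c1 * w1[1] + c2 * w2[1])

print(dot(u, v1), dot(u, v2))  # both positive, as the proof guarantees
```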