This is a challenging problem from an exercise in SI131 Linear Algebra for Information Science. This article elaborates my thoughts on this problem.
Proposition: Let $S, T : V \to V$ be linear transformations such that one is not a scalar multiple of the other. Suppose that $\operatorname{rank}(S) \geq 2$ and $\operatorname{rank}(T) \geq 2$.
Then there exists a $v \in V$ such that $Sv$ and $Tv$ are linearly independent.
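Before the proof, a quick sanity check (the matrices here are my own illustration, not part of the original exercise). On $\mathbb{R}^2$, the identity and a quarter-turn rotation are not scalar multiples of each other, both have rank $2$, and $v = e_1$ already witnesses the conclusion:

$$S = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad T = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad S e_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad T e_1 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

The rank condition cannot be dropped: $S(x, y) = (x, 0)$ and $T(x, y) = (y, 0)$ are not scalar multiples of each other, yet $Sv$ and $Tv$ lie on the line spanned by $e_1$ for every $v$, so no such $v$ exists there.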
Proof:
Let $U = \operatorname{range}(S)$ be the subspace spanned by a basis $\{u_1, \dots, u_m\}$, and $W = \operatorname{range}(T)$ be the subspace spanned by a basis $\{w_1, \dots, w_n\}$. There are two cases of $U$ and $W$:
If $U \cap W = \{0\}$, every non-zero vector in $U$ is linearly independent of every non-zero vector in $W$ (if $u = c\,w \neq 0$, then $u \in U \cap W$, a contradiction).
Find $v_1, v_2 \in V$ such that $Sv_1$ and $Tv_2$ are not zero (thus $v_1$ and $v_2$ are not zero either). We may also scale $v_1$ and $v_2$ arbitrarily, since scaling does not violate these conditions. Also notice that we can always find such $v_1$ and $v_2$, since neither $S$ nor $T$ is the zero map (the zero map is a scalar multiple of any transformation).
Then we construct a new vector $v = v_1 + v_2$, so that $Sv = Sv_1 + Sv_2$ and $Tv = Tv_1 + Tv_2$; there are three subcases, with a concrete illustration after them.
1) If $Tv_1 = 0$ and $Sv_2 = 0$: then $Sv = Sv_1 \neq 0$ lies in $U$ and $Tv = Tv_2 \neq 0$ lies in $W$. Because every non-zero vector of $U$ is linearly independent of every non-zero vector of $W$, $Sv$ is linearly independent of $Tv$, and $Sv$ and $Tv$ are linearly independent.
2) If $Tv_1 \neq 0$: then $Sv_1 \in U$ and $Tv_1 \in W$ are both non-zero, so $Sv_1$ and $Tv_1$ are linearly independent; take $v = v_1$.
3) If $Sv_2 \neq 0$: similar to 2), $Sv_2$ and $Tv_2$ are linearly independent; take $v = v_2$.
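As a concrete illustration of this case (my own toy example, not from the exercise), take $V = \mathbb{R}^4$ with $S$ keeping the first two coordinates and $T$ moving them into the last two, so that $U = \operatorname{span}(e_1, e_2)$ and $W = \operatorname{span}(e_3, e_4)$ intersect trivially:

$$Sx = (x_1, x_2, 0, 0), \qquad Tx = (0, 0, x_1, x_2), \qquad v_1 = e_1: \quad Sv_1 = e_1, \quad Tv_1 = e_3.$$

Here $Tv_1 \neq 0$, so we are in subcase 2), and $Sv_1$ and $Tv_1$ are indeed linearly independent.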
If $U \cap W \neq \{0\}$, some non-zero vector of $U$ is linearly dependent on the basis of $W$, i.e. it lies in $W$; then $U \cap W$ is itself spanned by a non-empty set of basis vectors, i.e. $U$ and $W$ share a non-trivial subspace. In this case we argue by contradiction.
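For instance (again a toy example of mine), the identity and a shear on $\mathbb{R}^2$ share their whole range:

$$S = I_2, \qquad T = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad U = W = \mathbb{R}^2,$$

and here $v = e_2$ gives the independent pair $Sv = (0, 1)^\top$, $Tv = (1, 1)^\top$; the argument below shows why such a $v$ must always exist in this case.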
Hypothesis: Suppose that for every $v \in V$, $Sv$ and $Tv$ are linearly dependent. Therefore, for every $v \in V$, we have:
$$\alpha_v\, Sv + \beta_v\, Tv = 0$$
for some scalars $\alpha_v, \beta_v$ that are not both zero.
W.l.o.g., we only care about those $v$ for which $Sv$ is non-zero. For such $v$ we must have $\beta_v \neq 0$ (otherwise $\alpha_v Sv = 0$ would force $\alpha_v = 0$ as well), so we can write $Tv = \lambda_v\, Sv$ with $\lambda_v = -\alpha_v / \beta_v$. We can always find such $v$ because $S \neq 0$.
Then, given $v_1, v_2 \notin \ker S$, we construct a vector $v = v_1 + v_2$ and compare $\lambda_{v_1}$ with $\lambda_{v_2}$.
1) If $Sv_1$ and $Sv_2$ are linearly dependent and both non-zero. Let $Sv_2 = c\,Sv_1$ with $c \neq 0$; we have:
$$Sv = (1 + c)\,Sv_1, \qquad Tv = \lambda_{v_1} Sv_1 + \lambda_{v_2}\, c\, Sv_1 = (\lambda_{v_1} + c\,\lambda_{v_2})\, Sv_1.$$
Both images lie on the line spanned by $Sv_1$, so they are linearly dependent no matter what, and the hypothesis gives no new information here. Instead, since $\operatorname{rank}(S) \geq 2$, we can pick $v_3$ with $Sv_3$ linearly independent of $Sv_1$ (hence also of $Sv_2$), and apply the argument of 2) below to the pairs $(v_1, v_3)$ and $(v_2, v_3)$, which yields $\lambda_{v_1} = \lambda_{v_3}$ and $\lambda_{v_2} = \lambda_{v_3}$.
Thus, we have $\lambda_{v_1} = \lambda_{v_2}$.
2) If $Sv_1$ and $Sv_2$ are linearly independent. Then $Sv = Sv_1 + Sv_2 \neq 0$.
Since $Sv$ and $Tv$ are still linearly dependent according to the hypothesis, $Tv = \lambda_v\, Sv$ for some scalar $\lambda_v$:
$$\lambda_{v_1} Sv_1 + \lambda_{v_2} Sv_2 = Tv_1 + Tv_2 = Tv = \lambda_v\,(Sv_1 + Sv_2).$$
Due to the independence of $Sv_1$ and $Sv_2$, the coefficients on both sides must agree: $\lambda_{v_1} = \lambda_v = \lambda_{v_2}$.
Thus, we have $\lambda_{v_1} = \lambda_{v_2}$ (a numerical illustration of this step follows after 3) below).
3) For $v \in \ker S$ and any $w \notin \ker S$, we have:
$$S(v + w) = Sw \neq 0, \qquad T(v + w) = Tv + Tw = Tv + \lambda_w\, Sw.$$
If $Tv$ is linearly independent of $Sw$, then $S(v+w)$ and $T(v+w)$ can not be linearly dependent (contradicting our hypothesis). Thus $Tv$ is always linearly dependent on $Sw$, no matter which $w \notin \ker S$ we choose. Our condition $\operatorname{rank}(S) \geq 2$ lets us choose $w_1, w_2$ with $Sw_1$ and $Sw_2$ linearly independent, so $Tv$ is a scalar multiple of two linearly independent vectors, i.e. $Tv = 0$. By the symmetric argument on $\ker T$ (using $\operatorname{rank}(T) \geq 2$), we have $Sv = 0$ for every $v \in \ker T$.
Thus, we have $\ker S = \ker T$, and $Tv = 0 = \lambda\, Sv$ on the common kernel for any scalar $\lambda$.
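To see the mechanism of 2) with concrete numbers (my own, purely illustrative): suppose $Sv_1 = e_1$, $Sv_2 = e_2$, $Tv_1 = 2e_1$ and $Tv_2 = 5e_2$, so that $\lambda_{v_1} = 2 \neq 5 = \lambda_{v_2}$. Then

$$S(v_1 + v_2) = e_1 + e_2, \qquad T(v_1 + v_2) = 2e_1 + 5e_2, \qquad \det\begin{pmatrix} 1 & 2 \\ 1 & 5 \end{pmatrix} = 3 \neq 0,$$

so $S(v_1 + v_2)$ and $T(v_1 + v_2)$ are linearly independent, violating the hypothesis; this is exactly why two distinct values of $\lambda$ cannot occur.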
Combining 1), 2), 3): if $Sv$ and $Tv$ are linearly dependent for every $v \in V$, then $\lambda_v$ takes a single common value $\lambda$ on $V \setminus \ker S$, and $Tv = 0 = \lambda\, Sv$ on $\ker S$; hence $Tv = \lambda\, Sv$ for all $v \in V$, i.e. $T = \lambda S$.
However, $S$ and $T$ are linear transformations such that one is not a scalar multiple of the other. Contradiction, i.e. there exists a $v \in V$ such that $Sv$ and $Tv$ are linearly independent. $\blacksquare$