Previous work has proposed to solve for a filter which, when placed in front of a camera, improves its colorimetric performance by best satisfying the Luther condition. That is, the filtered spectral sensitivities of the camera, after a linear transform, are as close as possible to the color matching functions of the human visual system. By construction, the prior art solves for a filter for a given set of human visual sensitivities, e.g. the XYZ color matching functions or the cone response functions. However, depending on the target sensitivity set, a different optimal filter is found. In this paper, we set out a method to solve for a filter that works equally well for all possible target sensitivity sets of the human visual system. We observe that the cone fundamentals, the CIE XYZ color matching functions, and any full-rank linear combination thereof all span the same vector space. Thus, we solve for a filter that makes the vector space spanned by the filtered camera sensitivities as similar as possible to the space spanned by the human visual sensitivities. We argue that the Vora-Value is a suitable measure of subspace similarity, and we develop an optimization method for finding a filter that maximizes the Vora-Value. Experiments demonstrate that our new optimization yields filtered camera sensitivities with a significantly higher Vora-Value and improved colorimetric performance compared with prior methods.
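As a rough illustration (not code from the paper), the Vora-Value between two sensor sets can be computed as the normalized trace of the product of the orthogonal projectors onto their column spaces. The sketch below uses made-up spectral data (31 samples, 3 sensors) and shows the key property exploited here: the measure is invariant to invertible linear recombinations of the sensitivities, so any basis of the human visual subspace yields the same target.

```python
import numpy as np

def projector(S):
    """Orthogonal projector onto the column space of S (n x k)."""
    return S @ np.linalg.pinv(S)

def vora_value(H, C):
    """Normalized similarity between the subspaces spanned by the
    columns of H (target sensitivities) and C (camera sensitivities).
    Ranges from 0 (orthogonal subspaces) to 1 (identical subspaces)."""
    k = H.shape[1]
    return np.trace(projector(H) @ projector(C)) / k

# Hypothetical data: 31 spectral samples, 3 sensors each.
rng = np.random.default_rng(0)
H = rng.random((31, 3))
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # an invertible 3x3 recombination

# H and H @ M span the same subspace, so the Vora-Value is 1:
print(round(vora_value(H, H @ M), 6))  # -> 1.0
```

The division by k (here 3, the number of sensors) normalizes the measure so that a perfect subspace match scores exactly 1 regardless of dimensionality.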