A growing field in machine learning is the development of computationally efficient algorithms for classifying sensor data at the edge. Because many sensors respond to different stimuli with different dynamics, the sensor dynamics may provide valuable information for classification. The problem of determining which of the dynamics has generated a particular observation vector is referred to as a multiclass discrimination problem. A discriminative metric can be computed in a Kronecker space provided the different dynamics are represented by state-space models having distinct observability matrices. In this paper, we express the metric in the original space rather than in the Kronecker space. One of the main contributions is to show theoretically that the metric for a given system is given by the norm of the matrix-vector product between the left nullspace of the observability matrix and the observation vector. In practice, when the system is unknown and noisy, an "approximate" nullspace is obtained with a data-driven approach using eigenvalue or singular value decomposition. We tested the classifier on synthetic and real datasets, and the results demonstrate the applicability of the method. The new formulation of the metric in the original space leads to a classifier with reduced complexity, so that learning can be implemented in a few lines of code. This low complexity makes the method suitable for machine learning on low-cost, low-power, and resource-constrained devices such as smartphones and sensors.
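The following is a minimal sketch (not the paper's reference implementation) of the data-driven variant described above: for each class, an approximate left nullspace of the unknown observability matrix is estimated from stacked training observation vectors via SVD, and a test vector is assigned to the class whose nullspace nearly annihilates it. The function names, the dictionary-based interface, and the fixed `rank` parameter (the assumed order of the observability matrix) are illustrative assumptions, not part of the paper.

```python
import numpy as np

def fit_nullspaces(train_windows, rank):
    """Estimate, for each class, an approximate left nullspace of the
    (unknown) observability matrix from training data.

    train_windows : dict mapping class label -> array of shape (m, n_samples),
                    each column an observation vector from that class.
    rank          : assumed column rank of the observability matrix
                    (illustrative parameter, chosen by the user).
    """
    nullspaces = {}
    for label, Y in train_windows.items():
        # Left singular vectors of Y: the first `rank` span the signal
        # subspace; the remaining ones approximate the left nullspace.
        U, _, _ = np.linalg.svd(Y, full_matrices=True)
        nullspaces[label] = U[:, rank:]          # shape (m, m - rank)
    return nullspaces

def classify(y, nullspaces):
    """Assign y to the class with the smallest metric ||N_k^T y||,
    i.e. the class whose approximate left nullspace nearly annihilates y."""
    metrics = {label: np.linalg.norm(N.T @ y) for label, N in nullspaces.items()}
    return min(metrics, key=metrics.get)
```

As the abstract suggests, learning reduces to one SVD per class and classification to one matrix-vector product and norm per class, which is why the approach fits resource-constrained devices.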