The generalization performance of SVM-type classifiers suffers severely from the 'curse of dimensionality': in many real-world applications, the dimensionality of the measurements greatly exceeds the number of available training samples. In this paper, a classification scheme is proposed for such scenarios and compared with existing techniques. The proposed scheme has two parts: (i) feature selection and transformation based on the Fisher discriminant criterion and (ii) a hybrid classifier that combines Kernel Ridge Regression with a Support Vector Machine to predict the data labels. The first part, named Successively Orthogonal Discriminant Analysis (SODA), is applied after Fisher-score-based feature selection as a preliminary dimensionality-reduction step. SODA maximizes the ratio of between-class scatter to within-class scatter to obtain an orthogonal transformation matrix that maps the features into a new low-dimensional feature space in which class separability is maximized. The techniques are tested on high-dimensional data from a microwave measurement system and compared with existing approaches.
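The dimensionality-reduction pipeline described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual code: the per-feature Fisher score assumes a two-class problem, the discriminant directions come from the standard generalized eigenproblem for the between/within scatter ratio, and the SVD-based deflation used to enforce orthogonality between successive directions is an assumed mechanism chosen for clarity.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher score for a two-class problem:
    (mu0 - mu1)^2 / (var0 + var1)."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices."""
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = (Xc.mean(axis=0) - mu)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Xc_centered = Xc - Xc.mean(axis=0)
        Sw += Xc_centered.T @ Xc_centered
    return Sb, Sw

def successive_orthogonal_directions(X, y, n_dirs):
    """Find discriminant directions one at a time: each direction maximizes
    the between/within scatter ratio, then the data are deflated onto the
    orthogonal complement so the next direction is orthogonal to all
    previous ones. Returns a matrix with orthonormal columns."""
    Xd = X.copy()
    basis = np.eye(X.shape[1])  # maps the deflated space back to the original
    W = []
    for _ in range(n_dirs):
        Sb, Sw = scatter_matrices(Xd, y)
        # Leading eigenvector of Sw^{-1} Sb maximizes w^T Sb w / w^T Sw w.
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        w = np.real(evecs[:, np.argmax(np.real(evals))])
        w /= np.linalg.norm(w)
        W.append(basis @ w)
        # Deflate: orthonormal basis of the complement of w via SVD of w w^T.
        _, _, Vt = np.linalg.svd(np.outer(w, w))
        Q = Vt[1:].T            # columns span the subspace orthogonal to w
        Xd = Xd @ Q
        basis = basis @ Q
    return np.column_stack(W)
```

A typical use would be to rank features with `fisher_scores`, keep the top-scoring subset, and then project that subset through the orthogonal transformation returned by `successive_orthogonal_directions` before training the downstream classifier.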