The concept of oriented principal component (OPC) analysis is introduced. It extends the generalized singular value decomposition (GSVD) to the case of random processes, much as principal component analysis extends the SVD to stochastic signals. In the random-signal case, OPC analysis is equivalent to matched filtering and proves useful in many classification and detection applications. The authors propose a corresponding neural model, equipped with an efficient training algorithm, for estimating the oriented principal component of two stochastic processes without assuming explicit knowledge of their statistics. The algorithm is based on a normalized Hebbian learning rule for training the synaptic weights of a network of neurons. Both the theoretical justification and the numerical performance of the algorithm are presented, together with an explicit estimate of the learning-rate parameter that yields the best convergence speed.
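To make the OPC idea concrete: for two zero-mean processes with covariance matrices R1 and R2, the oriented principal component is the direction w maximizing the generalized Rayleigh quotient (w'R1w)/(w'R2w), i.e. the leading generalized eigenvector of the pencil (R1, R2). The sketch below, using hypothetical toy data, compares a closed-form solution with a simple normalized gradient iteration on that quotient; the step size and iteration count are assumed values, and the update is an illustrative sketch rather than the paper's exact online Hebbian rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stationary processes (hypothetical data): x1 plays the "signal" role,
# x2 the "noise" role; their statistics are summarized by sample covariances.
n, d = 5000, 4
A = rng.standard_normal((d, d))
x1 = rng.standard_normal((n, d)) @ A
x2 = rng.standard_normal((n, d))
R1 = x1.T @ x1 / n
R2 = x2.T @ x2 / n

# Closed-form OPC: leading generalized eigenvector of the pencil (R1, R2),
# i.e. the direction w maximizing the ratio (w @ R1 @ w) / (w @ R2 @ w).
evals, evecs = np.linalg.eig(np.linalg.inv(R2) @ R1)
w_star = np.real(evecs[:, np.argmax(np.real(evals))])
w_star /= np.linalg.norm(w_star)

# Iterative estimate: normalized gradient ascent on the generalized Rayleigh
# quotient (a sketch, not the paper's algorithm; eta is an assumed value).
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
eta = 0.01
for _ in range(3000):
    lam = (w @ R1 @ w) / (w @ R2 @ w)      # current value of the objective
    w = w + eta * (R1 @ w - lam * R2 @ w)  # ascent direction of the quotient
    w /= np.linalg.norm(w)                 # keep w on the unit sphere

# The iterate aligns (up to sign) with the closed-form OPC direction.
alignment = abs(float(w @ w_star))
```

The normalization after each update mirrors the role of normalization in the Hebbian-type rule described in the abstract: it keeps the weight vector bounded while the ascent term rotates it toward the OPC direction.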