High dimensional model representation (HDMR) is a general set of quantitative model assessment and analysis tools for capturing high dimensional input-output system behavior. In practice, each HDMR component function is approximated by an appropriate basis function expansion. This procedure often requires many input-output samples, which can restrict the treatment of high dimensional systems. To address this problem, we introduce SVR-based HDMR, which efficiently and effectively constructs the HDMR expansion of a function f(x) by support vector regression (SVR). In this paper, results are reported for independent variables sampled over known probability distributions. The theoretical foundation of the new approach rests on the kernel used in SVR itself being an HDMR expansion (referred to as the HDMR kernel), i.e., an ANOVA kernel whose component kernels are mutually orthogonal and whose non-constant component kernels all have zero expectation. Several HDMR kernels are constructed as illustrations. While preserving the characteristic properties of HDMR, the SVR-based HDMR method enables efficient construction of high dimensional models with satisfactory prediction accuracy from a modest number of samples, which also permits accurate computation of the sensitivity indices. A genetic algorithm is employed to optimally determine all the parameters of the component HDMR kernels and of SVR. SVR-based HDMR opens a new route to advance HDMR algorithms. Two examples illustrate the capability of the method.
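The kernel construction described in the abstract can be sketched in code. The sketch below is a minimal illustration, not the paper's actual kernels or parameters: it assumes inputs uniformly distributed on [-1, 1], builds zero-mean one-dimensional component kernels from Legendre polynomials (which are mutually orthogonal and have zero expectation under the uniform measure for degree >= 1), combines them in a first-order ANOVA product form, and plugs the resulting Gram-matrix callable into scikit-learn's SVR; the toy target function and the choices `degree=3` and `C=10.0` are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn.svm import SVR

def component_kernel(x, z, degree=3):
    # Zero-mean 1-D kernel k_d(x, z) = sum_{n=1}^{degree} P_n(x) P_n(z).
    # For uniform inputs on [-1, 1], E[P_n] = 0 for n >= 1 and the P_n are
    # mutually orthogonal, mimicking the HDMR-kernel requirements.
    k = np.zeros((len(x), len(z)))
    for n in range(1, degree + 1):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                      # select the single polynomial P_n
        k += np.outer(legendre.legval(x, coeffs), legendre.legval(z, coeffs))
    return k

def hdmr_kernel(X, Z):
    # First-order ANOVA form: K(x, z) = prod_d (1 + k_d(x_d, z_d)).
    # Expanding the product yields a constant term, first-order terms, and
    # cross terms -- the structure of an HDMR expansion.
    K = np.ones((X.shape[0], Z.shape[0]))
    for d in range(X.shape[1]):
        K *= 1.0 + component_kernel(X[:, d], Z[:, d])
    return K

# Toy 2-D regression problem (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2

model = SVR(kernel=hdmr_kernel, C=10.0).fit(X, y)
pred = model.predict(X)
```

In the paper the kernel parameters (here, only the component degree) and the SVR parameters (such as C) are tuned jointly by a genetic algorithm; the fixed values above stand in for that optimization step.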
Original language: English (US)
Number of pages: 26
Journal: Journal of Mathematical Chemistry
State: Published - Jan 1 2017
All Science Journal Classification (ASJC) codes
- Applied Mathematics

Keywords
- Sensitivity analysis