Abstract
We introduce a well-motivated information-theoretic metric and new convex optimization algorithms for designing the architecture of a neural network to enhance its supervised learning capability. We formulate two optimization frameworks that admit efficient algorithms for large numbers of variables and accommodate a variety of practical constraints on the structural randomness of neural networks. Convex optimization is also applied to independent component analysis (ICA) and to multi-antenna fading communication channels.
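As an illustration of the kind of convex program that arises in the multi-antenna setting, the input-covariance design for a fading channel maximizes a concave log-det mutual-information objective under a power constraint. The sketch below is not from the paper; it is a minimal example assuming a fixed channel matrix `H`, a power budget `P`, and the `cvxpy` modeling interface.

```python
import numpy as np
import cvxpy as cp

# Illustrative only: a capacity-style convex program for a multi-antenna
# channel, not the paper's exact formulation. H and P are assumed inputs.
rng = np.random.default_rng(0)
n_tx, n_rx = 4, 4
H = rng.standard_normal((n_rx, n_tx))   # fixed channel realization
P = 1.0                                  # total transmit power budget

Q = cp.Variable((n_tx, n_tx), PSD=True)  # input covariance matrix
# log det(I + H Q H^T) is concave in Q (affine expression inside log_det),
# so maximizing it under a linear trace constraint is a convex problem.
objective = cp.Maximize(cp.log_det(np.eye(n_rx) + H @ Q @ H.T))
constraints = [cp.trace(Q) <= P]
problem = cp.Problem(objective, constraints)
problem.solve()

print("optimal mutual information (nats):", problem.value)
```

With the power constraint active, the optimal `Q` recovers the familiar water-filling allocation across the channel's singular directions.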
Original language | English (US) |
---|---|
Pages | 356-359 |
Number of pages | 4 |
State | Published - 2000 |
Event | 2000 IEEE Asia-Pacific Conference on Circuits and Systems: Electronic Communication Systems, Tianjin, China (Dec 4 2000 → Dec 6 2000) |
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering