TY - GEN
T1 - Hard-margin active linear regression
AU - Hazan, Elad
AU - Karnin, Zohar
N1 - Publisher Copyright:
Copyright © (2014) by the International Machine Learning Society (IMLS). All rights reserved.
PY - 2014
Y1 - 2014
N2 - We consider the fundamental problem of linear regression in which the designer can actively choose observations. This model naturally captures various experiment design settings in medical experiments, ad placement problems, and more. Whereas previous literature addresses the soft-margin or mean-square-error variants of the problem, we consider a natural machine learning hard-margin criterion. In this setting, we show that active learning admits significantly better sample complexity bounds than the passive learning counterpart, and give efficient algorithms that attain near-optimal bounds.
AB - We consider the fundamental problem of linear regression in which the designer can actively choose observations. This model naturally captures various experiment design settings in medical experiments, ad placement problems, and more. Whereas previous literature addresses the soft-margin or mean-square-error variants of the problem, we consider a natural machine learning hard-margin criterion. In this setting, we show that active learning admits significantly better sample complexity bounds than the passive learning counterpart, and give efficient algorithms that attain near-optimal bounds.
UR - http://www.scopus.com/inward/record.url?scp=84919930325&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84919930325&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84919930325
T3 - 31st International Conference on Machine Learning, ICML 2014
SP - 2492
EP - 2500
BT - 31st International Conference on Machine Learning, ICML 2014
PB - International Machine Learning Society (IMLS)
T2 - 31st International Conference on Machine Learning, ICML 2014
Y2 - 21 June 2014 through 26 June 2014
ER -