TY - GEN

T1 - The power of selective memory

T2 - 18th Annual Conference on Neural Information Processing Systems, NIPS 2004

AU - Dekel, Ofer

AU - Shalev-Shwartz, Shai

AU - Singer, Yoram

PY - 2005/1/1

Y1 - 2005/1/1

N2 - Prediction suffix trees (PSTs) provide a popular and effective tool for tasks such as compression, classification, and language modeling. In this paper we take a decision-theoretic view of PSTs for the task of sequence prediction. Generalizing the notion of margin to PSTs, we present an online PST learning algorithm and derive a loss bound for it. The depth of the PST generated by this algorithm scales linearly with the length of the input. We then describe a self-bounded enhancement of our learning algorithm which automatically grows a bounded-depth PST. We also prove an analogous mistake bound for the self-bounded algorithm. The result is an efficient algorithm that neither relies on a priori assumptions on the shape or maximal depth of the target PST nor requires any parameters. To our knowledge, this is the first provably correct PST learning algorithm that generates a bounded-depth PST while being competitive with any fixed PST determined in hindsight.

AB - Prediction suffix trees (PSTs) provide a popular and effective tool for tasks such as compression, classification, and language modeling. In this paper we take a decision-theoretic view of PSTs for the task of sequence prediction. Generalizing the notion of margin to PSTs, we present an online PST learning algorithm and derive a loss bound for it. The depth of the PST generated by this algorithm scales linearly with the length of the input. We then describe a self-bounded enhancement of our learning algorithm which automatically grows a bounded-depth PST. We also prove an analogous mistake bound for the self-bounded algorithm. The result is an efficient algorithm that neither relies on a priori assumptions on the shape or maximal depth of the target PST nor requires any parameters. To our knowledge, this is the first provably correct PST learning algorithm that generates a bounded-depth PST while being competitive with any fixed PST determined in hindsight.

UR - http://www.scopus.com/inward/record.url?scp=84899015432&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84899015432&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84899015432

SN - 0262195348

SN - 9780262195348

T3 - Advances in Neural Information Processing Systems

BT - Advances in Neural Information Processing Systems 17 - Proceedings of the 2004 Conference, NIPS 2004

PB - Neural Information Processing Systems Foundation

Y2 - 13 December 2004 through 16 December 2004

ER -