by Kfir Y. Levy, Andreas Krause
Abstract:
The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version, SGD. Unfortunately, in some cases projection is computationally demanding and inhibits us from applying OGD. In this work, we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that necessitates only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.
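To make the oracle-complexity claim concrete, below is a minimal, hypothetical sketch (not necessarily the paper's exact construction) of how a value-oracle bisection can serve as an approximate projection onto a set K = {x : h(x) <= 0}, and how it would plug into an OGD loop. The names `approx_projection`, `ogd`, the interior point `x0`, and the step-size schedule are illustrative assumptions; each bisection step costs one oracle call, so reaching precision eps takes O(log(1/eps)) calls.

import numpy as np

def approx_projection(y, h, x0, eps=1e-6):
    # Hypothetical sketch: given a query point y and an interior point x0
    # with h(x0) < 0, bisect along the segment x0 -> y to find a feasible
    # point near the boundary of K = {x : h(x) <= 0}.
    if h(y) <= 0.0:
        return y                      # y is already feasible
    lo, hi = 0.0, 1.0                 # interpolation weights toward y
    while hi - lo > eps:              # O(log(1/eps)) value-oracle calls
        mid = 0.5 * (lo + hi)
        x = (1.0 - mid) * x0 + mid * y
        if h(x) <= 0.0:
            lo = mid                  # feasible: push toward y
        else:
            hi = mid                  # infeasible: pull back toward x0
    return (1.0 - lo) * x0 + lo * y   # feasible point near the boundary

def ogd(loss_grads, h, x0, eta=0.1):
    # OGD where the exact projection is replaced by the approximate one.
    x = x0.copy()
    for t, grad_t in enumerate(loss_grads, start=1):
        x = approx_projection(x - (eta / np.sqrt(t)) * grad_t(x), h, x0)
    return x

if __name__ == "__main__":
    # Example constraint set: the unit Euclidean ball, h(x) = ||x||^2 - 1,
    # with the origin as an interior point.
    h = lambda x: np.dot(x, x) - 1.0
    x0 = np.zeros(2)
    loss_grads = [lambda x: np.array([1.0, -0.5]) for _ in range(20)]
    print(ogd(loss_grads, h, x0))

For the unit ball the exact projection is of course cheap; the sketch is only meant to illustrate the oracle-based interface and the logarithmic call count, which is what matters when the constraint set admits no closed-form projection.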
Reference:
Projection Free Online Learning Over Smooth Sets. K. Y. Levy, A. Krause. In Proc. International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.
Bibtex Entry:
@inproceedings{levy19projection,
	author = {Levy, Kfir Y. and Krause, Andreas},
	booktitle = {Proc. International Conference on Artificial Intelligence and Statistics (AISTATS)},
	month = {April},
	title = {Projection Free Online Learning Over Smooth Sets},
	year = {2019}}