by Andreas Krause
Abstract:
In recent years, a fundamental problem structure has emerged as very useful in a variety of machine learning applications: Submodularity is an intuitive diminishing returns property, stating that adding an element to a smaller set helps more than adding it to a larger set. Similarly to convexity, submodularity allows one to efficiently find provably (near-) optimal solutions for large problems. We present SFO, a toolbox for use in MATLAB or Octave that implements algorithms for minimization and maximization of submodular functions. A tutorial script illustrates the application of submodularity to machine learning and AI problems such as feature selection, clustering, inference and optimized information gathering.
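To make the diminishing returns property concrete, the sketch below shows the classic greedy heuristic for maximizing a monotone submodular function under a cardinality constraint, applied to a toy set-cover objective in plain MATLAB/Octave. This does not use the SFO API itself; the function and variable names (covers, F, V, B) are purely illustrative. For monotone submodular functions, this greedy rule is guaranteed to achieve at least a (1 - 1/e) fraction of the optimal value.

% Minimal sketch (plain MATLAB/Octave, not the SFO interface):
% greedy maximization of a toy coverage function under a size budget.

% Ground set: 4 sensors, each covering a subset of 6 locations.
covers = {[1 2 3], [3 4], [4 5 6], [2 6]};
F = @(S) numel(unique([covers{S}]));   % coverage is monotone submodular

V = 1:numel(covers);   % ground set
B = 2;                 % cardinality budget
S = [];                % greedy solution
for k = 1:B
    cand  = setdiff(V, S);                              % remaining elements
    gains = arrayfun(@(e) F([S e]) - F(S), cand);       % marginal gains
    [~, i] = max(gains);
    S = [S cand(i)];   % add the element with the largest marginal gain
end
fprintf('Greedy set: %s, coverage: %d\n', mat2str(S), F(S));

On this toy instance the greedy rule first picks the sensor covering {1,2,3} and then the one covering {4,5,6}, covering all six locations with two sensors.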
Reference:
A. Krause. SFO: A Toolbox for Submodular Function Optimization. In Journal of Machine Learning Research (JMLR), volume 11, pages 1141-1144, 2010.
Bibtex Entry:
@article{krause10sfo,
	author = {Andreas Krause},
	journal = {Journal of Machine Learning Research (JMLR)},
	pages = {1141-1144},
	title = {SFO: A Toolbox for Submodular Function Optimization},
	volume = {11},
	year = {2010}}