On top-k selection in multi-armed bandits and hidden bipartite graphs

Bibliographic Details
Main Authors: CAO, Wei, LI, Jian, TAO, Yufei, LI, Zhize
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2015
Online Access:https://ink.library.smu.edu.sg/sis_research/8671
https://ink.library.smu.edu.sg/context/sis_research/article/9674/viewcontent/NIPS15_full_MAB.pdf
Institution: Singapore Management University
Description
Summary: This paper discusses how to efficiently choose, from $n$ unknown distributions, the $k$ ones whose means are the greatest by a certain metric, up to a small relative error. We study the topic under two standard settings---multi-armed bandits and hidden bipartite graphs---which differ in the nature of the input distributions. In the former setting, each distribution can be sampled (in an i.i.d. manner) an arbitrary number of times, whereas in the latter, each distribution is defined on a population of a finite size $m$ (and hence is fully revealed after $m$ samples). For both settings, we prove lower bounds on the total number of samples needed, and propose optimal algorithms whose sample complexities match those lower bounds.
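The record contains no code, but the following minimal Python sketch illustrates the top-$k$ arm-selection setup described in the summary: every arm is pulled the same number of times and the $k$ arms with the highest empirical means are returned. The function names, the Bernoulli arm model, and the fixed per-arm budget are illustrative assumptions; this uniform-sampling baseline is not the paper's algorithm and does not attain the optimal sample complexity established there.

import numpy as np

def naive_top_k(sample_arm, n, k, pulls_per_arm):
    # Pull every arm the same number of times and keep the k arms with
    # the highest empirical means. Uniform-sampling baseline only; it does
    # NOT match the optimal sample-complexity bounds proved in the paper.
    means = np.array([np.mean([sample_arm(i) for _ in range(pulls_per_arm)])
                      for i in range(n)])
    return np.argpartition(-means, k - 1)[:k]  # indices of the k largest means

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_means = rng.uniform(0.0, 1.0, size=20)               # hidden arm means
    bernoulli_arm = lambda i: rng.binomial(1, true_means[i])  # one i.i.d. sample
    chosen = naive_top_k(bernoulli_arm, n=20, k=3, pulls_per_arm=2000)
    print("selected arms:", sorted(chosen.tolist()))
    print("true top-3   :", sorted(np.argsort(-true_means)[:3].tolist()))

In the hidden-bipartite-graph setting, sample_arm would instead draw without replacement from a finite population of size $m$, so an arm's mean is known exactly once all $m$ members have been probed.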