
"Unimodal Bandits: Regret Lower Bounds and Optimal Algorithms."

Richard Combes, Alexandre Proutière (2014)

Details and statistics

DOI:

access: open

type: Conference or Workshop Paper

metadata version: 2019-05-29