2023-04-11-christianson23a.md

File metadata and controls

50 lines (50 loc) · 1.85 KB
---
title: Optimal robustness-consistency tradeoffs for learning-augmented metrical task systems
abstract: 'We examine the problem of designing learning-augmented algorithms for metrical task systems (MTS) that exploit machine-learned advice while maintaining rigorous, worst-case guarantees on performance. We propose an algorithm, DART, that achieves this dual objective, providing cost within a multiplicative factor $(1+\epsilon)$ of the machine-learned advice (i.e., consistency) while ensuring cost within a multiplicative factor $2^{O(1/\epsilon)}$ of a baseline robust algorithm (i.e., robustness) for any $\epsilon > 0$. We show that this exponential tradeoff between consistency and robustness is unavoidable in general, but that in important subclasses of MTS, such as when the metric space has bounded diameter and in the $k$-server problem, our algorithm achieves improved, polynomial tradeoffs between consistency and robustness.'
section: Regular Papers
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: christianson23a
month: 0
tex_title: Optimal robustness-consistency tradeoffs for learning-augmented metrical task systems
firstpage: 9377
lastpage: 9399
page: 9377-9399
order: 9377
cycles: false
bibtex_author: Christianson, Nicolas and Shen, Junxuan and Wierman, Adam
author:
- given: Nicolas
  family: Christianson
- given: Junxuan
  family: Shen
- given: Adam
  family: Wierman
date: 2023-04-11
address:
container-title: Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
volume: 206
genre: inproceedings
issued:
  date-parts:
  - 2023
  - 4
  - 11
pdf:
extras:
---