Online Algorithms with Predictions

Date

2022

Abstract

Mitigating the uncertainty that stems from our inherent inability to predict the future is the founding pillar of online algorithms. However, the standard "worst-case" assumption is often too pessimistic, and it fails to distinguish between algorithms whose performance differs in practice. At the same time, advances in machine learning increasingly allow us to generate predictions about the future based on the past. Thus, a natural question arises: how can we use machine-learned predictions about the future to design algorithms that provably surpass the best worst-case performance guarantees? In this dissertation, we address this question in multiple contexts.

First, we consider paging, a classical problem in online algorithms. There are n pages, each with an associated weight, and there is a cache of size k < n. At each time step, some page is requested, and if it is not in the cache, then the algorithm must evict a page from the cache and fetch the requested page. The goal is to minimize the total weight of fetched pages. For this problem, we show that two existing prediction models do not asymptotically benefit any algorithm. We then define a new prediction model, show that it allows us to obtain a 2-competitive algorithm, and prove various upper and lower bounds in terms of the error in the predictions.
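To make the setup concrete, the following minimal Python sketch shows one well-known style of prediction model for the unweighted case: each request arrives with a predicted time of that page's next request, and on a miss the algorithm evicts the cached page whose predicted next request is furthest away (a predictive analogue of Belady's offline rule). This is illustrative only; it is not the dissertation's 2-competitive algorithm, and the function name and input format are assumptions.

    def predictive_paging(requests, k):
        """requests: list of (page, predicted_next_request_time) pairs."""
        cache = {}                  # page -> predicted time of its next request
        fetches = 0
        for page, pred_next in requests:
            if page in cache:
                cache[page] = pred_next        # page present: refresh its prediction
                continue
            fetches += 1                       # miss: the page must be fetched
            if len(cache) >= k:
                # Evict the page predicted to be requested furthest in the future.
                victim = max(cache, key=cache.get)
                del cache[victim]
            cache[page] = pred_next
        return fetches

    # Cache of size 2; each request carries a predicted next-request time.
    reqs = [("a", 3), ("b", 4), ("c", 9), ("a", 6), ("b", 8)]
    print(predictive_paging(reqs, k=2))        # -> 4 fetches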

Second, we consider a different prediction model, which we apply to unweighted paging and online covering problems. This model, at each time step, reveals some information about the optimal solution with (small) probability, and does not reveal any meaningful information otherwise. (The exact definition of this revealed information depends on the specific problem.) For paging, we analyze the algorithm that simply follows the prediction, and give a more sophisticated algorithm that achieves the optimal competitive ratio up to a constant factor. For a general class of online covering problems, we give an algorithm that, up to constants, achieves the optimal competitive ratio as well.
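The hypothetical Python sketch below conveys the flavor of this model for unweighted paging: with some small probability p, an oracle reveals a page that is safe to evict (a stand-in for whatever information about the optimal solution the model actually reveals), and otherwise the algorithm falls back to LRU. The oracle interface, the probability p, and the fallback rule are all assumptions made for illustration; the dissertation's algorithms are more refined.

    import random
    from collections import OrderedDict

    def paging_with_sporadic_hints(requests, k, p, oracle):
        cache = OrderedDict()                  # insertion order doubles as LRU order
        fetches = 0
        for page in requests:
            if page in cache:
                cache.move_to_end(page)        # most recently used goes to the back
                continue
            fetches += 1
            if len(cache) >= k:
                # With probability p, the oracle reveals a safe page to evict.
                hint = oracle(cache) if random.random() < p else None
                if hint in cache:
                    del cache[hint]            # follow the revealed information
                else:
                    cache.popitem(last=False)  # no hint: evict least recently used
            cache[page] = True
        return fetches

    # Toy oracle that simply names an arbitrary cached page.
    reqs = ["a", "b", "c", "a", "d", "b"]
    print(paging_with_sporadic_hints(reqs, k=2, p=0.5, oracle=lambda c: next(iter(c))))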

Third, we extend the notion of boosting classifiers to online algorithms. More specifically, we consider two notions of boosting applied to the classical ski rental problem: in the first, the algorithm has access to multiple predictors about the future, each of which is slightly more accurate than randomly guessing. We give tight bounds on the number of such predictors required, under multiple settings, to obtain a strong performance guarantee. For our second result, we show how to boost an online algorithm itself. In particular, we show that having access to an online algorithm that is slightly better than a worst-case algorithm allows us to obtain an algorithm whose competitive ratio is arbitrarily close to optimal.
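As a toy illustration of the first setting, suppose each weak predictor guesses whether the season will last at least B days (the buy cost), correctly with probability slightly above 1/2, and a majority vote decides between buying immediately and renting forever. The structure and names below are assumptions; the dissertation's contribution is tight bounds on how many such predictors suffice.

    import random

    def ski_rental_cost(season_len, B, buy_day):
        """Total cost when renting costs 1 per day and buying costs B.
        buy_day=None means never buy."""
        if buy_day is None or season_len < buy_day:
            return season_len                  # rented every day of the season
        return (buy_day - 1) + B               # rent until buy_day, then buy

    def majority_vote_strategy(predictions):
        """predictions: booleans, True = 'season lasts at least B days'."""
        if 2 * sum(predictions) > len(predictions):
            return 1                           # majority predicts long: buy on day 1
        return None                            # majority predicts short: always rent

    # 101 weak predictors, each correct with probability 0.6 (true season is long).
    season_len, B = 30, 10
    preds = [random.random() < 0.6 for _ in range(101)]
    print(ski_rental_cost(season_len, B, majority_vote_strategy(preds)))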

Finally, we consider an allocation problem in the offline setting. Although this problem is offline, it is a generalization of the AdWords problem, a well-studied model of online advertising. For our generalization, we consider concave-additive valuations, and define a new notion of curvature for such functions. We then give a general approximation algorithm whose performance depends on this curvature, and prove a corresponding lower bound on the integrality gap. We also demonstrate an application to the Nash Social Welfare objective, which could be of independent interest.
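For reference, the Nash Social Welfare objective is the geometric mean of the agents' valuations; for an allocation (S_1, ..., S_n) of items to n agents with valuations v_i, it is defined as follows (a standard definition, stated here for context rather than taken from the dissertation):

    \[
      \mathrm{NSW}(S_1, \dots, S_n) \;=\; \Bigl( \prod_{i=1}^{n} v_i(S_i) \Bigr)^{1/n}
    \]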

Citation

Sun, Kevin Tang (2022). Online Algorithms with Predictions. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/25823.

Duke's student scholarship is made available to the public using a Creative Commons Attribution / Non-commercial / No derivative (CC-BY-NC-ND) license.