Network resource allocation remains a challenge in many scenarios where fast service must be provided under changing and unpredictable conditions, for example those encountered in streaming data applications.
This short paper summarizes the results of a PhD thesis aimed at providing online policies with performance guarantees for network resource allocation. The algorithms are developed for unpredictable environments, modeled as the presence of an adversary.
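In adversarial online learning, such performance guarantees are typically expressed as regret, i.e., the gap between the cost accumulated by the online policy and that of the best fixed decision chosen in hindsight. A generic formulation (the notation here is illustrative, not taken from the thesis) is:

$$\mathrm{Regret}_T \;=\; \sum_{t=1}^{T} f_t(x_t) \;-\; \min_{x \in \mathcal{X}} \sum_{t=1}^{T} f_t(x),$$

where $f_t$ is the cost selected by the adversary at slot $t$, $x_t$ is the policy's decision, and $\mathcal{X}$ is the set of feasible allocations. A policy is said to guarantee performance when its regret grows sublinearly in the horizon $T$, so that its time-averaged cost approaches that of the best static allocation.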
The sections present three policies and a property. The first policy is online exact caching, in which a local cache containing the requested files avoids retrieval costs from a remote server. The second is similarity caching, which returns locally stored objects similar to the requested one. The third addresses inference delivery networks, in which computing nodes are coordinated to serve machine learning inference requests. Finally, the paper discusses fairness in network resource allocation, which should be ensured both at every time slot and over the whole time horizon.
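To make the first policy concrete, the following is a minimal sketch of a gradient-based online caching scheme of the kind commonly studied in adversarial settings: a fractional cache state is updated by a gradient step on the cache-hit reward and then projected back onto the capacity constraint. The class name, step size, and projection routine are assumptions for illustration, not the exact algorithm of the thesis.

```python
import numpy as np

def project_capped_simplex(y, k, iters=100):
    """Euclidean projection of y onto {x in [0,1]^N : sum(x) = k}, by bisection on the dual shift."""
    lo, hi = y.min() - 1.0, y.max()
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if np.clip(y - mid, 0.0, 1.0).sum() > k:
            lo = mid  # cache too full: increase the shift
        else:
            hi = mid
    return np.clip(y - (lo + hi) / 2.0, 0.0, 1.0)

class OnlineGradientCache:
    """Illustrative fractional cache updated by online gradient ascent on the hit reward."""
    def __init__(self, catalog_size, capacity, step_size):
        self.x = np.full(catalog_size, capacity / catalog_size)  # start from a uniform allocation
        self.k = capacity
        self.eta = step_size

    def hit_fraction(self, request):
        # Fraction of the retrieval cost avoided for this request
        return self.x[request]

    def update(self, request, retrieval_cost=1.0):
        # Gradient of the slot reward retrieval_cost * x[request] with respect to x
        grad = np.zeros_like(self.x)
        grad[request] = retrieval_cost
        self.x = project_capped_simplex(self.x + self.eta * grad, self.k)

# Example run on an arbitrary request sequence
rng = np.random.default_rng(0)
cache = OnlineGradientCache(catalog_size=100, capacity=10, step_size=0.1)
for t in range(1000):
    r = int(rng.zipf(1.2)) % 100
    reward = cache.hit_fraction(r)
    cache.update(r)
```

In the online caching literature, such fractional states are usually converted into integral cache configurations through randomized rounding or coupling schemes, and the step size is tuned so that the resulting regret grows sublinearly with the horizon.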
All the explored policies are developed in an adversarial setting, which makes them of interest to researchers. To fully understand the formulations and proofs, it is necessary to also read the thesis and a few referenced papers.