arxiv:2505.06641

SneakPeek: Data-Aware Model Selection and Scheduling for Inference Serving on the Edge

Published on May 10, 2025
Abstract

A model selection and scheduling algorithm that uses accuracy scaling and dynamic accuracy estimation improves efficiency in resource-constrained inference serving environments.

AI-generated summary

Modern applications increasingly rely on inference serving systems to provide low-latency insights with a diverse set of machine learning models. Existing systems often utilize resource elasticity to scale with demand. However, many applications cannot rely on hardware scaling when deployed at the edge or other resource-constrained environments. In this work, we propose a model selection and scheduling algorithm that implements accuracy scaling to increase efficiency for these more constrained deployments. We show that existing schedulers that make decisions using profiled model accuracy are biased toward the label distribution present in the test dataset. To address this problem, we propose using ML models -- which we call SneakPeek models -- to dynamically adjust estimates of model accuracy, based on the underlying data. Furthermore, we greedily incorporate inference batching into scheduling decisions to improve throughput and avoid the overhead of swapping models in and out of GPU memory. Our approach employs a new notion of request priority, which navigates the trade-off between attaining high accuracy and satisfying deadlines. Using data and models from three real-world applications, we show that our proposed approaches result in higher-utility schedules and higher accuracy inferences in these hardware-constrained environments.
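To make the priority idea concrete, here is a minimal sketch of deadline-aware model selection. All names (`Model`, `Request`, `priority`, `select_model`) and the specific weighting `alpha` are illustrative assumptions, not the paper's actual formulation; the paper's priority notion, dynamic accuracy estimation, and batching logic are more involved.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    accuracy: float   # estimated accuracy on current data (hypothetically adjusted by a SneakPeek-style estimator)
    latency: float    # per-request inference latency in seconds

@dataclass
class Request:
    id: int
    deadline: float   # seconds from now

def priority(req: Request, model: Model, now: float = 0.0, alpha: float = 0.5) -> float:
    """Toy priority: rewards accuracy, but only credits deadline
    satisfaction if the model can finish before the request's deadline."""
    slack = req.deadline - now - model.latency
    deadline_term = 1.0 if slack >= 0 else 0.0
    return alpha * model.accuracy + (1.0 - alpha) * deadline_term

def select_model(req: Request, models: list[Model], now: float = 0.0) -> Model:
    """Greedily pick the model that maximizes the priority score."""
    return max(models, key=lambda m: priority(req, m, now))

# Usage: a tight deadline should steer selection toward the faster,
# less accurate model; a loose deadline toward the more accurate one.
models = [Model("small", accuracy=0.80, latency=0.05),
          Model("large", accuracy=0.95, latency=0.50)]
print(select_model(Request(id=1, deadline=0.1), models).name)   # tight deadline
print(select_model(Request(id=2, deadline=2.0), models).name)   # loose deadline
```

This captures only the accuracy-versus-deadline trade-off described in the summary; the actual system additionally batches requests and accounts for the cost of swapping models in and out of GPU memory.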

