Title: FASE: fast deployment for dependent applications in serverless environments
Authors: Saha, R.; Satpathy, A.; Addya, S.K.
Date: 2026-02-04
Citation: Journal of Supercomputing, 2024, 80(8), pp. 10394-10417
Identifier: 9208542
DOI: https://doi.org/10.1007/s11227-023-05840-w
URI: https://idr.nitk.ac.in/handle/123456789/21160

Abstract: Function-as-a-service (FaaS) has reduced the user burden by allowing cloud service providers to take over operational activities such as resource allocation, service deployment, auto-scaling, and load balancing, to name a few. Users are responsible only for developing the business logic through event-triggered functions catering to an application. Although FaaS brings multiple user benefits, a typical challenge in this context is the time incurred in setting up the environment of the containers on which the functions execute, often referred to as the cold-start time, which leads to delayed execution and quality-of-service violations. This paper presents an efficient scheduling strategy, FASE, that uses a finite-sized warm pool to facilitate the instantaneous execution of functions on pre-warmed containers. Test-bed evaluations over AWS Lambda confirm that FASE achieves a 40% reduction in average cold-start time and a 1.29× speedup compared to the baselines. © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023.

Keywords: Computation theory; Internet of things; Quality of service; Web services; Cloud service providers; Cold-start; Fast deployments; Function-as-a-service; Load-balancing; Operational activity; Resource allocation; Scaling; Serverless computing; Service deployment; Containers