Serverless computing in clouds

In serverless computing, a cold start refers to the delay that occurs when a serverless function is invoked for the first time after being idle for a period, or when it is first deployed. This start-up delay increases the latency of the first request and can be a performance concern. Cold starts can be mitigated by keeping instances of serverless functions warm and ready to handle requests. However, keeping a container alive increases the cost of the resources used and can even exhaust the available resources. It is therefore important to strike a balance between keeping a container alive long enough to avoid cold starts and keeping it alive for too long, by using a well-designed keep-alive policy that adjusts the container's lifetime or priority accordingly. Since resource management for serverless functions is similar to object caching, we have implemented caching-inspired algorithms, such as the Greedy-Dual keep-alive policy and the Weighted Greedy-Dual keep-alive policy, to evaluate their performance. By analysing and utilising the characteristics of FaaS workloads, we also aim to design a more principled policy that efficiently reduces cold start overhead.
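As context for the abstract above, the Greedy-Dual family of keep-alive policies treats warm containers like cached objects: each container is assigned a priority based on recency, invocation frequency, cold-start cost, and memory footprint, and the lowest-priority container is evicted when memory runs short. The Python sketch below illustrates one common formulation (priority = clock + frequency * cold-start cost / memory size); the class and field names, and the exact priority formula, are illustrative assumptions rather than details taken from this thesis.

    from dataclasses import dataclass

    @dataclass
    class Container:
        name: str               # function identifier (illustrative)
        cold_start_cost: float  # seconds needed to initialise a fresh container
        memory_mb: float        # memory held while the container stays warm
        frequency: int = 0      # invocations observed so far
        priority: float = 0.0   # Greedy-Dual keep-alive priority

    class GreedyDualKeepAlive:
        # priority = clock + frequency * cold_start_cost / memory_mb
        # The lowest-priority warm container is evicted first; the clock is
        # advanced to the evicted priority so idle containers age out over time.
        def __init__(self, capacity_mb: float):
            self.capacity_mb = capacity_mb
            self.used_mb = 0.0
            self.clock = 0.0
            self.warm = {}  # function name -> warm Container

        def on_invocation(self, name, cold_start_cost, memory_mb):
            container = self.warm.get(name)
            if container is None:  # cold start: evict until the new container fits
                self._make_room(memory_mb)
                container = Container(name, cold_start_cost, memory_mb)
                self.warm[name] = container
                self.used_mb += memory_mb
            container.frequency += 1
            container.priority = (self.clock
                                  + container.frequency * container.cold_start_cost
                                  / container.memory_mb)
            return container

        def _make_room(self, needed_mb: float) -> None:
            while self.used_mb + needed_mb > self.capacity_mb and self.warm:
                victim = min(self.warm.values(), key=lambda c: c.priority)
                self.clock = victim.priority      # "inflate" the clock on eviction
                self.used_mb -= victim.memory_mb
                del self.warm[victim.name]

Under this formulation, a rarely invoked container with a cheap cold start and a large memory footprint ages out quickly, while frequently invoked, expensive-to-start functions tend to stay warm.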

Bibliographic Details
Main Author: Wong, Jia Wen
Other Authors: Tang Xueyan
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2023
Subjects: Engineering::Computer science and engineering
Online Access:https://hdl.handle.net/10356/165881
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-165881
School: School of Computer Science and Engineering
Contact: ASXYTang@ntu.edu.sg
Degree: Bachelor of Engineering (Computer Science)
Project Code: SCSE22-0235
Collection: DR-NTU (NTU Library, Nanyang Technological University, Singapore)
Date Deposited: 2023-04-14
File Format: application/pdf
Citation: Wong, J. W. (2023). Serverless computing in clouds. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/165881