Published 18/08/2024
Navigating the Edge: Edge Functions and Serverless Solutions
The Evolution of Cloud Solutions
Cloud computing offers a myriad of options to address different use cases, particularly with the adoption of microservices and event-driven design. Virtual Machines (VMs) allow multiple applications to run on a single resource, while containerized apps enable users to package and deploy their applications across various environments. Serverless computing has further simplified the development process by allowing developers to focus solely on implementing business logic, leaving the management of computing resources, hardware, and operating systems to the cloud provider. The next step in this evolution is edge computing.
Edge Computing and Functions
With the rise of IoT and smart devices, edge computing has become a key focus for cloud providers, who are now offering solutions tailored to these use cases. In simple terms, edge computing involves processing data closer to the source, near the data origin or the clients making the requests. This approach can mitigate latency issues, as traditional cloud data centers are often centralized in specific regions around the globe, while clients could be located anywhere. By placing the processing closer to the source, response times improve significantly.
Use Cases for Edge Computing
Edge computing servers can quickly process data and return responses while also sending the data downstream for further aggregation by other servers. This is particularly useful in scenarios like processing video footage from cameras or transforming data from a production plant for further analysis. Edge computing can also facilitate more localized content delivery for web pages and more. Content Delivery Networks (CDNs) like Cloudflare and Akamai have enhanced page load times by caching static assets at the edge. However, advancements in edge computing have expanded its capabilities, offering a host of new features that enhance user experience.
Edge Computing with Cloudflare
Cloudflare, for instance, has developed a comprehensive suite of edge computing solutions. With a vast network spanning 310 cities across 120+ countries, Cloudflare offers robust features such as Workers—a serverless solution that enables you to execute applications at the edge, reaching a global audience once deployed. Additionally, they offer KV, a globally distributed key-value database that serves as storage for applications running on Cloudflare Workers.
Cloudflare global network
Features of Cloudflare Workers
- Supports JavaScript, TypeScript, Python, and Rust
- Bundle size up to 10MB (compressed)
- Maximum Worker startup time of 400ms
- Up to 128 environment variables of up to 5KB each
- Zero cold start
These features come with certain limitations inherent to edge functions:
- No maximum request duration, but CPU time is limited based on the plan
- Maximum memory of 128MB
- Up to 1,000 subrequests per request and 6 simultaneous outgoing connections
Given the nature of edge computing, these limitations are expected compared to VM or container services, where hardware resources can be customized. Tasks like A/B testing or personalized/localized services can be effectively run at the edge, while mission-critical business services requiring high availability and data consistency should continue using other cloud solutions like VMs.
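As a minimal sketch of the kind of localization logic that fits well at the edge, the Worker below picks a greeting based on the visitor's country, which Cloudflare populates on `request.cf` at the edge. The greeting table is purely illustrative, and the fallback covers environments (such as local tests) where `request.cf` is absent:

```javascript
// Minimal sketch of a localized response served from the edge.
// `request.cf.country` is set by Cloudflare's network; outside a Worker
// it is undefined, so we fall back to a default greeting.
function pickGreeting(country) {
  const greetings = { DE: "Hallo", FR: "Bonjour", ES: "Hola" };
  return greetings[country] ?? "Hello";
}

const worker = {
  async fetch(request) {
    const greeting = pickGreeting(request.cf?.country);
    return new Response(greeting, {
      headers: { "content-type": "text/plain" },
    });
  },
};
// In a Worker module this object would be `export default worker;`
```

Because the decision runs at the edge node nearest the user, the localized response never has to travel to a central region and back.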
Demo for Cloudflare Workers
I deployed a simple application using Cloudflare Workers that decides which version of a web page to serve based on a predefined feature flag stored in Cloudflare KV. Content localization or A/B testing can also be implemented in other ways, based on user metadata such as region or profile.
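A hedged sketch of how such a Worker might look is below. The KV namespace binding `PAGE_FLAGS`, the key `landing-page-variant`, and the page markup are all hypothetical stand-ins, not the demo's actual identifiers:

```javascript
// Sketch of a feature-flag Worker. `PAGE_FLAGS` is a hypothetical KV
// namespace binding and "landing-page-variant" a hypothetical key.
function renderPage(variant) {
  return variant === "beta"
    ? "<h1>Welcome to the new landing page</h1>"
    : "<h1>Welcome</h1>";
}

const worker = {
  async fetch(request, env) {
    // KV reads are served from the edge and are eventually consistent.
    const variant = (await env.PAGE_FLAGS.get("landing-page-variant")) ?? "stable";
    return new Response(renderPage(variant), {
      headers: { "content-type": "text/html" },
    });
  },
};
// In a Worker module this object would be `export default worker;`
```

Flipping the flag in KV switches the served variant globally without redeploying the Worker, which is what makes this pattern convenient for gradual rollouts.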
Deploying this application at the edge reduces load times and allows caching of the result for repeated requests. By utilizing edge networks, we can implement A/B testing or feature gating logic closer to users, thereby reducing latency. Additional benefits include improved time to first byte (TTFB), which further enhances page performance.
Similarly, content API services can be deployed at the edge to reduce latency for users. When coupled with cache control techniques, solutions can be fine-tuned to determine exactly how content is cached using cache-control directives like max-age, s-maxage, public, or private.
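As an illustrative sketch (the TTL values are arbitrary, not recommendations), a response can be tuned so that browsers cache it briefly while shared caches at the edge hold it for longer via s-maxage:

```javascript
// Sketch: attaching cache-control directives to an edge response.
// max-age governs the browser's cache; s-maxage governs shared caches
// such as the CDN. The defaults below are illustrative only.
function withCacheHeaders(body, { browserTtl = 60, edgeTtl = 3600 } = {}) {
  return new Response(body, {
    headers: {
      "content-type": "application/json",
      "cache-control": `public, max-age=${browserTtl}, s-maxage=${edgeTtl}`,
    },
  });
}
```

Splitting the two TTLs this way lets the edge keep serving a cached copy long after individual browsers have revalidated, which is often the right trade-off for shared content.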
As a simple showcase of this behavior, I deployed two applications: one on Cloudflare’s edge network and another on a serverless function hosted on Vercel in the US-East region. The Cloudflare-hosted application acts as a proxy, requesting data from the serverless function hosted in the US and caching the response.
The request to the origin server in the US takes several seconds due to cold start and geographical distance. However, subsequent requests to the edge function return almost instantaneously, as the previous response was cached, significantly reducing response times to below 100ms. While timings may vary depending on factors such as the request's origin, the location of edge nodes, and the deployment of the origin server, the point remains: edge computing can greatly reduce API latency and enhance your application’s performance for a global audience.
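The proxy flow described above can be sketched as follows. The origin URL is a placeholder rather than the demo's real endpoint, and the cache and fetch functions are injected as parameters so the logic can be exercised outside a Worker; in a real Worker you would pass `caches.default` and the global `fetch`:

```javascript
// Sketch of the edge proxy: serve from the edge cache when possible,
// otherwise fetch from the origin and cache the response.
const ORIGIN = "https://origin.example.com/api/data"; // hypothetical URL

async function proxyWithCache(request, cache, fetcher) {
  const cached = await cache.match(request);
  if (cached) return cached; // edge hit: no round trip to the origin

  const originResponse = await fetcher(ORIGIN);
  // Re-wrap the response so its headers are mutable, then mark it as
  // cacheable at the edge for five minutes (an illustrative TTL).
  const response = new Response(originResponse.body, originResponse);
  response.headers.set("cache-control", "public, s-maxage=300");
  await cache.put(request, response.clone());
  return response;
}

// In a Worker:
// export default { fetch: (req) => proxyWithCache(req, caches.default, fetch) };
```

Only the first request pays the cold-start and cross-region cost; every subsequent request within the TTL is answered from the edge cache.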
Direct query of the serverless function
Edge function with cached response
A link to the demo application can be found here