Cloud Functions (2nd gen) is the latest version of Google Cloud Functions, built on top of Cloud Run and Eventarc. It offers significant improvements over the 1st generation, including higher performance, longer timeouts, concurrency support, and a unified eventing model. Google recommends using Gen 2 for all new function deployments.
The most significant architectural change in Gen 2 is that functions are deployed as Cloud Run services under the hood. When you deploy a Gen 2 function, Google Cloud automatically creates a Cloud Run service, builds a container image using Cloud Build, stores it in Artifact Registry, and deploys it to Cloud Run. This gives Gen 2 functions all the benefits of Cloud Run's container-based infrastructure.
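Because a Gen 2 function is a Cloud Run service under the hood, you can inspect it with the ordinary Cloud Run commands. A minimal sketch (the function name `myFunction` and the region are placeholders from the example later in this lesson):

```shell
# List Cloud Run services in the region; a deployed Gen 2 function
# appears here as a service with the same name.
gcloud run services list --region europe-west2

# Show the underlying service's details (revisions, image, scaling settings).
gcloud run services describe myFunction --region europe-west2
```

This is also why Gen 2 features like traffic splitting work: they are Cloud Run revision features surfaced through the functions layer.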
| Feature | Gen 1 | Gen 2 |
|---|---|---|
| Max timeout | 540 seconds (9 min) | 9 minutes (event-driven), 60 minutes (HTTP) |
| Max memory | 8 GiB | 32 GiB |
| Max vCPUs | 2 | 8 |
| Concurrency | 1 request per instance | Up to 1,000 concurrent requests per instance |
| Min instances | 0 | 0 (configurable minimum) |
| Traffic splitting | Not supported | Supported (revision-based) |
| Eventarc integration | Limited | Full support (130+ event sources) |
One of the most impactful improvements in Gen 2 is concurrency support. In Gen 1, each function instance processes exactly one request at a time. If 100 requests arrive simultaneously, Gen 1 spins up 100 instances. In Gen 2, a single instance can handle up to 1,000 concurrent requests (configurable).
```bash
# Deploy with concurrency set to 80
gcloud functions deploy myFunction \
  --gen2 \
  --runtime python312 \
  --trigger-http \
  --concurrency 80 \
  --region europe-west2
```
When using concurrency, your function code must be thread-safe. Shared in-memory state (global variables, caches) must be protected with appropriate synchronisation. Each concurrent request shares the same memory allocation, so you may need to increase memory to support more concurrent requests.