Google Cloud Run memory capacity reached on multiple concurrent requests

Server Fault Asked on November 16, 2021

I have a process that requires a lot of memory: it downloads a file from GCS, analyzes it, then writes some output back to GCS. The endpoint for this Cloud Run service essentially takes a file ID in GCS. If I hit this endpoint multiple times in a second, each request dies with a memory-limit error. I have the Cloud Run service set to 2GB, which is the highest option available. Is this memory shared across all requests? Is Cloud Run the wrong service to be using for this type of workload?

One Answer

I'm not sure what kind of analysis you are doing on the file, but it sounds memory-intensive. For reference, Cloud Run tops out at 2GB, as do Cloud Functions and App Engine Standard, while App Engine Flex and Compute Engine allow much higher limits. You can also check the pricing calculator for each product's maximums. As the name implies, Compute Engine is built for heavy computation. Which service is best depends on your workflow, though, and this is a fairly open-ended question.
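Whichever service you land on, one way to reduce per-request memory is to process the file in fixed-size chunks rather than loading it whole. Below is a minimal sketch of that pattern using only the standard library (computing a SHA-256 digest stands in for your analysis step); with the `google-cloud-storage` client the same loop works over the file-like object returned by `blob.open("rb")`:

```python
import hashlib
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB: peak memory per request stays near this size


def checksum_stream(stream, chunk_size=CHUNK_SIZE):
    """Process a file-like object in fixed-size chunks instead of
    reading it fully into memory (here: compute a SHA-256 digest)."""
    digest = hashlib.sha256()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        digest.update(chunk)
    return digest.hexdigest()


# Stand-in for a GCS download stream; with google-cloud-storage the
# same pattern applies to blob.open("rb").
data = b"x" * (3 * CHUNK_SIZE + 17)
assert checksum_stream(io.BytesIO(data)) == hashlib.sha256(data).hexdigest()
```

Whether this helps depends on whether your analysis can actually run incrementally; if it needs the whole file in memory at once, chunked reading only saves you the duplicate copy held during download.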

Answers:

  • Is this memory shared across all requests? Memory is allocated per container instance, and one instance can serve many concurrent requests, so concurrent requests handled by the same instance share its 2GB. If a single request already loads a large file into memory, a burst of requests to one instance will push it past the limit.
  • Is Cloud Run the wrong service to be using for this type of workload? It might be, if the processing is synchronous and memory-intensive.
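Given that memory is per instance, one mitigation worth trying before switching services is to cap concurrency so each container instance handles a single request at a time, giving every request the full memory allocation. A sketch, not a tested recipe (`my-service` is a placeholder name):

```shell
# Serve one request per instance so a single request's working set
# gets the whole 2Gi to itself; Cloud Run scales out instances instead.
gcloud run services update my-service \
  --concurrency 1 \
  --memory 2Gi
```

The trade-off is cost: more instances spin up under load instead of one instance multiplexing requests.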

Hope this helps.

Answered by Alexis on November 16, 2021
