Server Fault: Asked on November 16, 2021
I have a process that requires a lot of memory: it downloads a file from GCS, analyzes the file, then writes some output back to GCS. The endpoint for this Cloud Run service essentially takes a file ID in GCS. If I hit this endpoint multiple times in a second, I get an out-of-memory error and each request dies. I have the Cloud Run service set to 2GB, which is the highest option available. Is this memory shared across all requests? Is Cloud Run the wrong service to be using for this type of workload?
Answers:

I'm not sure what kind of analysis you are doing, but it sounds memory-intensive. To answer your first question: yes, the memory is shared. A single Cloud Run container instance can serve many requests concurrently (up to 80 by default), and all requests running on the same instance share that instance's memory limit, which is why several simultaneous requests push you past 2GB. As for limits: Cloud Run currently caps out at 2GB per instance, as do Cloud Functions and App Engine Standard, whereas App Engine Flex and Compute Engine support much larger memory configurations. You can also use the pricing calculator for each product to see its maximum. As the name implies, Compute Engine is well suited to heavy analysis, but which service is best depends on your workflow, and this is a fairly open-ended question.
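If each request needs most of the memory on its own, one mitigation within Cloud Run itself is to set concurrency to 1, so each container instance handles a single request at a time and Cloud Run scales out with additional instances instead of packing requests together. A minimal sketch using gcloud (the service name, image, and region below are placeholders, not values from your setup):

```
# Hypothetical names for illustration; substitute your own service, image, and region.
# --concurrency 1 means one request per instance, so requests no longer share memory.
gcloud run deploy file-analyzer \
  --image gcr.io/my-project/file-analyzer \
  --memory 2Gi \
  --concurrency 1 \
  --region us-central1
```

The trade-off is more instances and more cold starts, but each request then gets the full 2GB allocation to itself.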
Hope this helps.
Answered by Alexis on November 16, 2021