
What would be a best way to test background processes locally

Viewed 269 times
7 replies

I am developing software using Google Cloud Functions, and I have multiple tasks that I want to run on the server in the background (and also report progress back to the app), but I cannot find anything that is easy to test locally, which is a must for us. I want to test and debug my code locally before deploying my solution.

What are the options? Rank them by complexity and scalability if possible. Thank you for any information.

A little more info about the background jobs: I need to download some files, analyze them, and then store them. These actions can take anywhere from a few seconds to a few minutes, or up to half an hour for larger files.


78021870

Hi!

What kind of testing are you talking about? Unit testing? Integration testing?

78027871

Have you thought of using Cloud Run instead? It has a longer request timeout compared to Cloud Functions, and it supports Jobs in case you need to run background processes!

Also Cloud Run has an emulator you can run locally!

78032208

Sure, I bumped into this option in the docs, but it seemed too hard to implement. Can you provide me with an example?
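Not from the thread itself, but to make the Cloud Run Jobs suggestion concrete, here is a minimal sketch of what a Job entrypoint can look like in Node.js. The file list and `filesForTask` helper are hypothetical stand-ins for the download/analyze/store work; Cloud Run Jobs inject `CLOUD_RUN_TASK_INDEX` and `CLOUD_RUN_TASK_COUNT` so parallel tasks can split the work, and locally you can just run the script with `node job.js`.

```javascript
// job.js - hypothetical Cloud Run Job entrypoint.
// Cloud Run Jobs set CLOUD_RUN_TASK_INDEX and CLOUD_RUN_TASK_COUNT so each
// task can pick its own slice of the work; locally they default to 0 and 1.
const taskIndex = Number(process.env.CLOUD_RUN_TASK_INDEX ?? 0);
const taskCount = Number(process.env.CLOUD_RUN_TASK_COUNT ?? 1);

// Hypothetical work list; in practice this could come from a bucket listing.
const files = ['a.csv', 'b.csv', 'c.csv', 'd.csv'];

// Each task takes every taskCount-th file, starting at its own index.
function filesForTask(allFiles, index, count) {
  return allFiles.filter((_, i) => i % count === index);
}

async function main() {
  for (const file of filesForTask(files, taskIndex, taskCount)) {
    console.log(`task ${taskIndex}: processing ${file}`);
    // download / analyze / store would go here
  }
}

main();
```

The same script runs unchanged locally (one task) and on Cloud Run, where executing the job fans it out across tasks.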

78031028

A simple way to do this:

  1. Get a local Redis server running - a Docker container is the easiest way.

  2. In your code, you need to know whether it is running locally or in Google Cloud. GCP metadata can help you there; a Node.js library for this is https://www.npmjs.com/package/gcp-metadata.

  3. Once you know you are running locally, you will push your batch job as a hash map in Redis.

  4. Then, there is going to be another process that needs to be notified, via Redis pub/sub, that a batch job is scheduled to run.

  5. An alternative to pubsub could be Redis queues - https://redis.com/glossary/redis-queue/.

  6. Either way, the second process will pick up batch jobs and execute them in sequence. Since this is a local environment, you don't need to worry about parallel execution.

  7. This is easy to debug and you can post logs back into Redis as well.

  8. Another advantage is that you can run your batch jobs again and again by posting the same payload as a Redis hash. This helps in debugging, as you don't have to execute the primary code that triggers the batch job.

This sounds complex, but it is a simple way to ensure all your developers are able to run these batch jobs locally.

You will need a Docker Compose file added to your repo so that developers can spin up the local containers quickly.
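The steps above can be sketched with plain in-process structures standing in for Redis (a `Map` for the hash, an array for the queue); this keeps the example self-contained, and with a real Redis server you would use `HSET` for the job payload and `LPUSH`/`BRPOP` (or pub/sub) for the queue instead. The job IDs and payload shape are hypothetical.

```javascript
// In-process sketch of the Redis pattern described above.
const jobs = new Map();   // jobId -> payload hash (step 3)
const queue = [];         // pending job ids (steps 4-5)
const logs = [];          // worker posts logs back here (step 7)

// Producer: store the job payload and enqueue its id.
function pushJob(jobId, payload) {
  jobs.set(jobId, payload);
  queue.push(jobId);
}

// Worker: pick up batch jobs and execute them in sequence (step 6).
function runWorker(handler) {
  while (queue.length > 0) {
    const jobId = queue.shift();
    handler(jobId, jobs.get(jobId));
    logs.push(`done: ${jobId}`);
  }
}

// Re-running a job is just re-pushing the same stored payload (step 8).
pushJob('job-1', { file: 'report.csv', action: 'analyze' });
runWorker((id, p) => console.log(`running ${id} on ${p.file}`));
```

The point of the sketch is the shape of the pattern: the payload lives in a hash keyed by job id, the queue only carries ids, so replaying a job never touches the code that originally produced it.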

78572670

There appears to be a whole documentation page, as well as a framework, that allows us to run Cloud Functions locally for testing purposes.

https://cloud.google.com/functions/docs/running/overview

78609376

Supervisor in a Docker container

78609935

I had a similar use case a few months ago. I used Cloud Pub/Sub topics to report any completed tasks. The Pub/Sub topic had a simple BigQuery subscription, so all the data was transformed into BigQuery table records.

This worked well for my use case, where a few hundred jobs run at a time and send status updates. This setup worked in my local environment as well.
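For this reporting pattern, the key detail is that a BigQuery subscription writes the message's JSON fields into table columns, so the payload has to match the table schema. A sketch, where the schema (`job_id`, `status`, `updated_at`) and the topic name are hypothetical; the actual publish call (shown commented out so the sketch stays self-contained) would use the `@google-cloud/pubsub` client:

```javascript
// Build a status message whose JSON fields mirror the BigQuery table
// columns (hypothetical schema: job_id, status, updated_at).
function buildStatusMessage(jobId, status) {
  return {
    job_id: jobId,
    status,                               // e.g. 'running' | 'done' | 'failed'
    updated_at: new Date().toISOString(), // TIMESTAMP column in BigQuery
  };
}

// With @google-cloud/pubsub, publishing would look roughly like:
//   const { PubSub } = require('@google-cloud/pubsub');
//   const topic = new PubSub().topic('job-status');   // hypothetical topic
//   await topic.publishMessage({ json: buildStatusMessage('job-1', 'done') });
```

Each background job then publishes one of these per state change, and the rows show up in BigQuery with no extra plumbing.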