We are using the gorilla/mux framework to handle web requests, and I assumed it automatically runs on all CPU cores. Would using goroutines in this case be beneficial for CPU-intensive work, such as looping over large objects?
I want the server to automatically use all CPU cores.
You guessed wrong. Kind of.
Starting with Go 1.5, Go uses all cores by default (GOMAXPROCS defaults to the number of CPUs) and schedules goroutines across them. But if your code never starts additional goroutines, it has no way to take advantage of those cores.
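You can see this default for yourself; a minimal sketch using the standard `runtime` package:

```go
package main

import (
	"fmt"
	"runtime"
)

// coreInfo reports how many cores the machine has and how many the Go
// scheduler will use in parallel (GOMAXPROCS defaults to NumCPU since Go 1.5).
func coreInfo() (cores, procs int) {
	return runtime.NumCPU(), runtime.GOMAXPROCS(0) // 0 = query without changing
}

func main() {
	cores, procs := coreInfo()
	fmt.Println("cores:", cores, "GOMAXPROCS:", procs)
}
```

On a stock Go 1.5+ runtime the two numbers match unless you override GOMAXPROCS yourself.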
Would using goroutines in this case be beneficial for CPU-intensive processes, such as looping over large objects?
Yes. But you are asking the wrong question.
You don’t use goroutines primarily to take advantage of multiple CPU cores (although that is a side benefit). You use goroutines to prevent your program from blocking while performing operations that take a while.
For web applications, most requests are not CPU-intensive, but they spend a lot of time (in computer terms) waiting for something to happen: waiting for a DNS lookup on the requested hostname, waiting for the database to look up user credentials to establish a session, waiting for the database to store or return the rows used to generate the HTTP response, and so on.
Without concurrency, your server can’t do anything else while it waits. So if a typical HTTP request takes one second to look up DNS, verify the authorization cookie, fetch results from the database, and send the response, no other HTTP clients can be served during that second.
The good news is that Gorilla (and almost every other web framework for Go) builds on the standard net/http package, which already uses goroutines to handle requests. So you already have (at least) one goroutine per HTTP request.
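This per-request goroutine comes for free from net/http; a sketch using the standard library's httptest server (gorilla/mux handlers behave identically, since mux is just an http.Handler):

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// hello is an ordinary handler; net/http (which gorilla/mux builds on)
// already runs it in a fresh goroutine for every incoming request.
func hello(w http.ResponseWriter, r *http.Request) {
	fmt.Fprintln(w, "hello from a per-request goroutine")
}

// fetchStatus spins up an in-process test server and makes one request to it.
func fetchStatus() int {
	srv := httptest.NewServer(http.HandlerFunc(hello))
	defer srv.Close()
	resp, err := http.Get(srv.URL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	return resp.StatusCode
}

func main() {
	fmt.Println("status:", fetchStatus())
}
```

Nothing in the handler mentions goroutines; the server's accept loop spawns them for you.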
Whether using additional goroutines makes sense depends on your application design, not on “using more CPU cores”.
If, to serve a request, you are doing multiple things that can be done in parallel, use goroutines. Say you need to execute three DB queries: you can run each query in its own goroutine so they execute concurrently. They might run on different CPU cores, but that has little to do with the actual performance gain, since each goroutine spends most of its time doing nothing but waiting for the database. You still get a speedup. (For completeness: goroutines are not the only way to run database queries in parallel, but they are the idiomatic and easy way in Go.)
If you have tasks that are triggered by an HTTP request but do not affect the HTTP response, you can run them in goroutines so the response returns sooner. Logging is a common example: you may want to log every HTTP request, but if the logger is slow, you don’t want to wait for logging to finish before sending the response. So do the logging in a goroutine; it can complete long after the client has received the response. Another example is sending an email: don’t make the HTTP client wait for the email to be sent before you reply.