Go modules with App Engine Flexible

For a recent project I spun up a Google App Engine (GAE) instance with Go. That was pretty straightforward. Thanks to Go 1.11's native module support, the Go project can be located anywhere on your filesystem.

$ tree server
server/
├── go.mod
├── go.sum
└── main.go
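The go.mod file declares the module path and its dependencies. A minimal sketch might look like the following (the module path and version pin are hypothetical placeholders; gorilla/mux is the dependency that shows up in the error message later in this post):

```go
// go.mod (sketch; module path and version are placeholders)
module example.com/my-project/server

go 1.12

require github.com/gorilla/mux v1.7.3
```

go.sum is generated automatically by the go tool and records the checksums of these dependencies.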

The deployment is then just a matter of:

# .gitlab-ci.yml
deploy_app_engine:
  image: google/cloud-sdk:270.0.0
  script:
    - gcloud --quiet app deploy server/app.yaml

With the App Engine configuration:

# app.yaml
runtime: go112

See this post from Dennis Allund and this post from Yogest Lakhotia for more details about continuous deployment.

Great! Simple!

But then I had to switch to the GAE Flexible environment (partly because we wanted to use Google Cloud Endpoints to authenticate our users from Firebase, and Cloud Endpoints support for GAE Standard is still in beta and only works with the App Engine Flexible environment). To change from GAE Standard to Flexible, you'd think you just need to extend the app.yaml like this:

# app.yaml
runtime: go1.12 # very important, the runtime name changed!
env: flexible
manual_scaling:
  instances: 1

But if you then try to run your pipeline, you will get errors like the following:

2019/11/17 12:26:42 staging for go1.12
main-package: server
2019/11/17 12:26:42 Staging Flex app: failed analyzing /builds/gitlab/my-project/server: cannot find package "github.com/gorilla/mux" in any of:
    ($GOROOT not set)
    /builds/gitlab/my-project/server/src/github.com/gorilla/mux (from $GOPATH)
GOPATH: /builds/gitlab/my-project/server/

When $GOPATH is not set, it simply defaults to ~/go, but the issue persists.

App Engine Flexible is not aware of Go modules and always looks for packages in $GOPATH. This StackOverflow post confirms that GAE Flexible does not support Go modules, and the official documentation for GAE, Using Go libraries, only mentions that you can fetch Go modules with go get. Well, thanks a lot. That step is not even possible in the cloud-sdk Docker container, because it does not have Go installed.

I came up with three possible workarounds for this issue. All of them fetch the Go modules before sending the source code over to GAE, which in turn uses Cloud Builder to compile the Go binary.

1. Vendor and Commit Go modules into your repository

Some Go projects already implement this approach to stay in total control of their dependencies and to be able to build offline. Download all project dependencies into the vendor directory and commit them directly to the Git repository. This allows Cloud Builder to compile the binary because the source repository includes all dependencies.

cd server/
go mod vendor
git add vendor/
git commit -m 'Vendor all Go dependencies'

However, depending on your project as well as your workflow and philosophy, you may not be fond of this approach because it drastically increases the size of your repository. Read on.

2. Install Go and fetch modules in Docker container

The second approach is to install Go in the cloud-sdk container and vendor the dependencies before deploying. Since the container is Debian-based, we can simply install Go with apt:

# .gitlab-ci.yaml
deploy_app_engine:
  stage: deploy
  image: google/cloud-sdk:270.0.0
  before_script:
    - apt update -qq && apt install -y -q golang-go
  script:
    - cd server/
    - go mod vendor
    - gcloud --quiet app deploy app.yaml

This approach definitely works, but it installs the Go tools in the container every time. An alternative approach would be to create your own version of the image based on google/cloud-sdk with Go preinstalled, but then you need to maintain that image and keep it up to date.
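Such a custom image could be as simple as the following Dockerfile sketch, reusing the base image tag and Go package from the pipeline above (treat this as an untested starting point, not a maintained recipe):

```dockerfile
# Dockerfile (sketch): google/cloud-sdk with Go preinstalled
FROM google/cloud-sdk:270.0.0
RUN apt update -qq && apt install -y -q golang-go
```

The CI job would then reference this image instead of google/cloud-sdk and drop the before_script block.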

3. Use Build Artifacts to retrieve modules

If your build server supports copying files between jobs (like job artifacts in GitLab CI), you can reuse the dependencies downloaded in a previous build step by copying them over to the deploy step (which logically comes after you have built and tested the application).

An example pipeline may look like this:

# .gitlab-ci.yaml
stages:
  - build
  - deploy

build_server:
  stage: build
  image: golang:1.12
  script:
    - cd server/
    - go mod vendor
    - go vet ./...
    - go test ./...
    - go build .
  artifacts:
    paths:
      - server/vendor/

deploy_app_engine:
  stage: deploy
  image: google/cloud-sdk:270.0.0
  script:
    - gcloud --quiet app deploy server/app.yaml
  ## artifacts are automatically copied to server/vendor/

In my opinion, this solution is the most elegant one if you do not want to commit the dependencies into your repository, because it is also the most efficient: the dependencies are only downloaded once per pipeline. Whether you can use it depends on whether your build system supports copying artifacts from one build step to another.


If you finally manage to deploy somehow, you still need to wait an eternity for Google Cloud to actually deploy your service:

Updating service [default] (this may take several minutes)...
...........................................................
(many dots later)
....done.

In my case this amounts to a whopping 8 minutes (every time)! Who knows what they’re doing up there in the cloud!

As you can probably tell from this post, I'm not exactly happy about how Google is treating its own programming language. The support in the App Engine Flexible environment seems half-baked. Go modules were officially introduced in Go 1.11 (August 2018), and more than one year later (December 2019) they are still not supported by GAE Flexible.