Running Laravel on Google App Engine

I recently deployed GNR Comparo, which is built with Laravel on Google App Engine. Here are some notes from my experience deploying this stack.

I followed these instructions and they worked. But to get things up and running there were some concepts I needed to understand that did not seem to be explained anywhere. Here’s what I worked out.

.gcloudignore

It was not immediately clear what files I should be ignoring (ie, not deploying). After some experimentation, I reached the following conclusions:

Google Cloud Build will run Composer for you, so you do not need to send your vendor folder up to Google App Engine.

However, it will not run your NPM build jobs (in Laravel, typically npm run prod). I settled on running these locally and deploying the built code in the /public folder. So you do not need to send the node_modules folder or your JS sources.

And then you can ignore the usual meta files that are also in your .gitignore - eg, on a Mac things like .DS_Store.

My resulting .gcloudignore file is here:

# Ignore this file, obviously:
.gcloudignore

# Git stuff:
.git
.gitignore
.gitattributes

# PHP Composer dependencies:
/vendor/

# NPM:
/node_modules/

# Source code
/resources/js
/resources/sass
/resources/img

# Mac things and other meta files:
.DS_Store

app.yaml

In the standard environment, app.yaml does a lot of the work of Laravel’s .env file, notably holding potentially private or secret environment variables such as APP_KEY. So it is both private and environment-specific, just like .env, and you should not add it to a git repo. I’ve added an app.yaml.example file with a blank key, just like Laravel’s .env.example. The project README can then explain how to edit the data in this file (eg adding the app key). The app.yaml.example file is copied below.
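That README step can be a simple copy-and-substitute. Here is a minimal sketch; it creates a stand-in app.yaml.example inline so it is self-contained, and uses a placeholder key where a real project would run php artisan key:generate --show:

```shell
# Stand-in for the real app.yaml.example (which ships with a blank key):
cat > app.yaml.example <<'EOF'
env_variables:
  APP_KEY: base64:your_key_here
EOF

# In a real project: APP_KEY="$(php artisan key:generate --show)"
APP_KEY="base64:EXAMPLEKEY"

# Copy the example into place, splicing in the key:
sed "s|base64:your_key_here|$APP_KEY|" app.yaml.example > app.yaml
grep APP_KEY app.yaml
```

Keeping the substitution in a script means the secret never has to be hand-edited into a tracked file.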

Static Folders

I am deploying the results of npm run prod so I had to add some static paths for assets in the public folder.

The resulting app.yaml.example holding the above config and these static folder definitions looks like this:

runtime: php72

handlers:
  - url: /js
    static_dir: public/js
  - url: /css
    static_dir: public/css
  - url: /img
    static_dir: public/img
  - url: /fonts
    static_dir: public/fonts

env_variables:
  APP_KEY: base64:your_key_here
  APP_STORAGE: /tmp
  VIEW_COMPILED_PATH: /tmp
  SESSION_DRIVER: cookie
  LOG_CHANNEL: stackdriver

These paths could change depending on your webpack.mix.js script. Basically, any directory in public into which you place built assets will need a corresponding static path entry in app.yaml.
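As a quick sanity check, you can list the asset directories actually present in public and print the handler stanza each one needs. A throwaway sketch — the mkdir line just fabricates illustrative directories:

```shell
# Emit an app.yaml static handler entry for every directory under public/.
mkdir -p public/css public/js   # illustrative dirs for this sketch
for d in public/*/; do
  name=$(basename "$d")
  printf "  - url: /%s\n    static_dir: public/%s\n" "$name" "$name"
done
```

The output can be pasted straight under handlers: in app.yaml.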

Versioning

I found the standard Laravel Mix versioning to work just fine with Google App Engine Standard Environment, without any extra config. This is added to webpack.mix.js:

if (mix.inProduction()) {
    mix.version();
}

and this goes in the relevant Blade layout file:

<script src="{{ mix('js/app.js') }}" defer></script>
<link href="{{ mix('css/app.css') }}" rel="stylesheet">

and then Laravel found the mix-manifest.json and served up the correct links.
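For reference, mix-manifest.json (written into public by mix.version()) is just a map from each plain asset path to its hash-stamped URL, which the mix() helper looks up at render time. An illustrative manifest, with invented hashes:

```json
{
    "/js/app.js": "/js/app.js?id=0123456789abcdef",
    "/css/app.css": "/css/app.css?id=fedcba9876543210"
}
```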

Deployment

Deployment is then as simple as building the production assets and calling deploy:

npm run prod
gcloud app deploy

That said, I found that to get a reliable deployment I needed to replace gcloud app deploy with gcloud beta app deploy --no-cache. This ensured a fresh set of caches and recompiled views, etc.

Database and Migration

To set up Laravel with Cloud SQL, first create the Cloud SQL instance. Then add the following to the env_variables of your app.yaml. Replace myinstance with your instance name; note that the DB_SOCKET value is /cloudsql/ followed by the instance connection name, which takes the form project:region:instance.

  DB_SOCKET: /cloudsql/myinstance:europe-west1:mysql
  DB_DATABASE: myinstance
  DB_USERNAME: root
  DB_PASSWORD: mypassword

This works really well, in that .env holds your localhost config, and app.yaml holds your deployed config. You do not need to manage config files for different environments. However, the live database password is stored in plaintext in your project root - more reason not to add app.yaml to a Git repo.
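For completeness, the instance itself can be created from the command line. A sketch, with an illustrative instance name, database, and tier — pick the tier deliberately, as it drives the billing, and these commands assume an authenticated gcloud with a project set:

```shell
gcloud sql instances create mysql \
    --database-version=MYSQL_5_7 \
    --region=europe-west1 \
    --tier=db-f1-micro
gcloud sql databases create myinstance --instance=mysql
gcloud sql users set-password root --host=% \
    --instance=mysql --password=mypassword
```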

Then you will need to connect to the remote database to run jobs like php artisan migrate. I could not find a way to do this as part of an automated deployment in Google Cloud itself, so settled on this workaround to script it.

First, install the Cloud SQL Proxy and, as described on that page, create a service account and download a credentials key in JSON format. Store this in your project root, but again do not add it to a git repo.

As it stands, if you run php artisan migrate it will pick up the config vars from .env and run the migration job on your local DB. To make it work with Cloud SQL, you need to override these .env vars to point at the Cloud SQL instance rather than localhost. The following shell script, running in its own shell, will do this as a one-off without you needing to swap between different .env files.

In the following, replace myinstance with your instance name, and mypassword with your password.

#!/bin/zsh

# awk copes with the leading spaces in ps output (cut -d' ' -f1 would not):
PID=$(ps -o pid=,comm= | grep -m1 cloud_sql_proxy | awk '{print $1}')
echo "Proxy PID is $PID"
if [ -n "$PID" ]; then
	echo "Proxy is running.";
else
	echo "Starting proxy...";
	cloud_sql_proxy -instances=myinstance:europe-west1:mysql=tcp:3307 -credential_file=myinstance-nnnnnnnn.json &
fi

sleep 3
export DB_HOST=127.0.0.1 DB_PORT=3307 DB_DATABASE=myinstance DB_USERNAME=root DB_PASSWORD=mypassword
php artisan migrate

The flaw is that your production DB password is again stored in plaintext. The solution I have in mind is to store this password in an Ansible vault and build an Ansible deployment system. This would create the app.yaml and the above DB migration script from templates, run gcloud app deploy and the DB migration script, then tidy up.