Developing GCP Cloud Functions locally with Typescript

October 12, 2020

In this post I'll show you how I develop Cloud Functions locally using TypeScript! I assume you already have some familiarity with TypeScript, so we'll dive straight in with getting our project set up.

For this we're going to use Google's TypeScript style guide tool, gts.

gts provides us with a no-fuss, opinionated TypeScript setup. It includes a style guide with linting and auto-fixing. For me this just gets rid of the pain of configuring new projects! gts is extensible, so if you disagree with the style guide then you're more than welcome to customise away to suit your team / project.
mkdir my-cf-project
cd my-cf-project
# use --yarn if you want to use yarn as your package manager, omit for npm.
npx gts init -y --yarn

If you run a quick ls of the current directory you'll see that gts has set up your TS project. Excellent!

Note: You can use the yarn lint and yarn fix commands to lint and automatically fix most formatting problems in your code. You can also configure your IDE to do this for you.

Next up, we need a way of running Cloud Functions locally. For this we can use Google's Functions Framework. Over on their GitHub you'll see various runtimes for NodeJS, PHP, Python and more. For us, we're using their NodeJS library.

We can add that as a dev dependency, along with @types/express to include the express package's TypeScript definitions (without this you may get an error from tsc saying it couldn't find the type definitions).

yarn add -D @google-cloud/functions-framework @types/express
# or if using npm:
npm install --save-dev @google-cloud/functions-framework @types/express

In addition to this we'll need two more packages: concurrently and nodemon. These let us run tsc in watch mode to continuously recompile our TypeScript, while nodemon restarts the Functions Framework CLI tool whenever the compiled files change.

yarn add -D concurrently nodemon
# or if using npm:
npm install --save-dev concurrently nodemon

Then we'll need to modify our package.json to add our new scripts:

  "scripts": {
    "start": "functions-framework --source=build/src/ --target=helloWorld",
    "watch": "concurrently \"tsc -w\" \"nodemon --watch ./build/ --exec npm run start\"",

So here our start script defines the build/src directory as the source for our function, and --target=helloWorld names the exported function which will be executed when requests come into the Functions Framework (via HTTP requests).

Next up, let's replace src/index.ts with the following:

import type {HttpFunction} from '@google-cloud/functions-framework/build/src/functions';

export const helloWorld: HttpFunction = (req, res) => {
  res.send('Hello, World!');
};

With that done, we're ready to test! You can use the watch command to compile and run your application for development.

yarn watch
# or:
npm run watch

And, with a little luck, you can open up http://localhost:8080/ and you'll see Hello, World! in your browser!

But wait, what about developing multiple functions!?

This solution is great if you want one repo per function, or to manage each function with its own configuration per directory. Over time, however, this will become more cumbersome to manage. So let's refactor this a little and get it working for multiple functions.

It's worth saying that, at the time of writing, Google's Functions Framework does not support developing multiple functions at once. Therefore, we need a little hack.

The best way I've found is to have a single entrypoint function, which then routes through to your individual functions based on the URL path. This works perfectly for any HTTP functions you have. I'll cover other function types in another post.

So with that in mind, rename your src/index.ts to src/helloWorld.ts and create a new src/index.ts.

In the newly created src/index.ts you'll want to receive the request then route it through to your helloWorld.ts function. Like so:

import type {HttpFunction} from '@google-cloud/functions-framework/build/src/functions';
import {helloWorld} from './helloWorld';

const ROUTES = [
  {
    pathRegex: /^\/helloWorld/,
    function: helloWorld,
  },
];

export const index: HttpFunction = (req, res) => {
  const routes = ROUTES.filter(route => req.path.match(route.pathRegex));
  if (routes.length === 0) {
    throw new Error(
      'Unknown path, have you defined your function in src/index.ts?'
    );
  }

  const [entrypoint] = routes;
  return entrypoint.function(req, res);
};

export {helloWorld};

You may have noticed the last line, export {helloWorld}; — why am I doing this? Well, GCP Cloud Functions don't allow you to specify which file contains the function to execute. By default it will look for either index.js or function.js. Therefore, to deploy any of our functions they need to be exported from our index.ts file too.

You'll then need to modify your package.json start script to point to your exported index function:

+   "start": "functions-framework --source=build/src/ --target=index",
-   "start": "functions-framework --source=build/src/ --target=helloWorld",

Now with this solution, any time you wish to add a new function to be triggered, you can add a new object to the ROUTES constant along with a path regex to match. I also throw an error when no routes are matched, with a handy message to help guide the developer as to what might be wrong.

It isn't perfect, but we can make additional modifications along the way if we need to. For example, you may expect the request path (req.path) for your function to be empty (i.e. routed as if to the index / of the cloud function's domain). That can be achieved by stripping the matched prefix from the request URL before executing the function. Note that req.path is read-only and derived from req.url, so we rewrite req.url rather than req.path (calling req.path.replace on its own would be a no-op, since String.replace returns a new string). So our index function becomes:

  const [entrypoint] = routes;
+ req.url = req.url.replace(entrypoint.pathRegex, '');
  return entrypoint.function(req, res);

What about testing?

For testing we'll use mocha, chai and chai-http. However, before we can install these dependencies we need to make a small change to our tsconfig.json, because chai-http expects some browser-based types/interfaces to be defined. So, under the compilerOptions object we'll add "lib": ["es2018", "dom"], and we also need to add "esModuleInterop": true so that chai and chai-http can be imported using es6 import syntax.

  "extends": "./node_modules/gts/tsconfig-google.json",
  "compilerOptions": {
    "rootDir": ".",
    "outDir": "build",
+   "lib": ["es2018", "dom"],
+   "esModuleInterop": true,
  "include": [

Next up, let's add our dependencies for testing:

yarn add -D express ts-node chai chai-http mocha @types/chai @types/mocha

Next, modify our package.json to include our test script:

-    "test": "echo \"Error: no test specified\" && exit 1",
+    "test": "mocha -r ts-node/register test/**/*.test.ts",

Then let's create a simple test for our helloWorld function:

mkdir test
touch test/helloWorld.test.ts

For our test/helloWorld.test.ts file:

import chai from 'chai';
import chaiHttp from 'chai-http';
import express from 'express';
import {helloWorld} from '../src/helloWorld';

chai.use(chaiHttp);
chai.should();

const app = express();
app.all('/', helloWorld);

describe('helloWorld function', () => {
  describe('GET /', () => {
    it('should return hello world', done => {
      chai
        .request(app)
        .get('/')
        .end((err, res) => {
          res.text.should.equal('Hello, World!');
          done();
        });
    });
  });
});

Now you can run yarn test and see that our tests are working wonderfully!

Any other considerations?

This approach works well for small HTTP-based functions. There are a few other things you may want to think about for your project:

  • What about other events which trigger your cloud functions (PubSub events, perhaps)?
  • Will your dependencies grow larger over time? You may need to split functions up to keep dependency sizes down (however, don't run before you can walk). Large dependencies may have a negative impact on your function's cold start times.
  • How does this impact your deployment strategy? You'll need to package the build/src directory along with your package.json and yarn.lock/package-lock.json.
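On the first point: a PubSub-triggered function receives a message rather than an HTTP request, so it doesn't fit the routing hack above. As a rough sketch only, assuming the base64-encoded data field that PubSub delivers (onMessage and decode are illustrative names, not part of the framework):

```typescript
// Illustrative PubSub message shape: data arrives base64-encoded.
type PubSubMessage = {data?: string};

// Decode helper, split out so it's easy to unit test on its own.
const decode = (message: PubSubMessage): string =>
  message.data ? Buffer.from(message.data, 'base64').toString() : '';

// onMessage is a hypothetical background function name.
export const onMessage = (message: PubSubMessage): void => {
  console.log(`Received: ${decode(message)}`);
};
```

I'll cover wiring these non-HTTP triggers into local development properly in another post.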


There you have it, a simple way to develop your HTTP-based cloud functions locally using TypeScript! Hopefully this has helped you set up a new project, or reconsider the way you're currently developing cloud functions.