How We Built GraphQL Subscriptions with Apollo
Michael Paris
At Scaphold, we use Apollo both to enrich our frontend React application with data and to build snappy backend GraphQL APIs. It was important for us to provide powerful, real-time features to every application on Scaphold, and Apollo’s new subscription protocol allowed us to build these features faster than we could have hoped. This post is a high-level overview of how we leverage Apollo to rapidly build features at Scaphold.io!
We’ve recently published an in-depth tutorial on how to start using Apollo Client and Scaphold.io to rapidly build real-time applications of your own! The tutorial walks you through creating a real-time chat application we call Slackr. Check it out!
Why GraphQL Subscriptions
Subscriptions offer a clean and efficient way to get pushed updates in real time. They are the counterpart to mutations: just as mutations describe the set of actions you can take to change your data, subscriptions define the set of events you can listen for when data changes. In fact, you can think of subscriptions as a way to react to mutations performed elsewhere.
Being able to subscribe to changing data has become increasingly important in modern applications. Why poll for data with expensive HTTP requests when we could just as easily make our app real-time? GraphQL subscriptions to the rescue! GraphQL subscriptions give us all the benefits of GraphQL, in real time.
A real-world example
At Scaphold, we manage a lot of infrastructure to make sure our customers’ APIs stay available and performant. One of our core features is a graphical schema designer that lets you easily define the GraphQL schema behind your Scaphold API. What you don’t see when playing around with the schema designer, however, is a complex migration system that keeps your API’s database up to date and in line with your schema. When you migrate your schema, the machine that fields the migration request makes the necessary changes to your database so that your API stays in sync.
However, a problem becomes evident when you consider that we run a highly available cluster of web servers to make sure your application doesn’t go down. To keep our latencies low, we cache a version of your schema on each server so we don’t have to recreate it on every request. How, then, can we make sure that outdated versions of your schema are invalidated on every server where they are cached?
Enter subscriptions! We use the same infrastructure that powers the subscriptions in your client APIs to power systems in our management APIs. For example, our web servers subscribe to all schema migration events and make sure to invalidate and rebuild schemas as soon as they are changed. This is how our frontend application is always able to pull the most recent version of your schema, even though it may be communicating with any one of our many servers.
The subscription query might look something like this:
subscription SubscribeToSchemaMigration {
  subscribeToMigrateSchema {
    # Use this id to know which schema to invalidate.
    id
  }
}
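Under the hood, firing that subscription is just a publish on a pubsub channel (more on that machinery below). As a hypothetical sketch, where the channel name, payload shape, and onSchemaMigrated hook are assumptions about our internals rather than real Scaphold code, the worker that applies the migration might do something like:

// Hypothetical sketch: after a migration commits, publish an event on the
// channel backing subscribeToMigrateSchema so every web server that has
// this schema cached can invalidate and rebuild it.
import { PubSub } from 'graphql-subscriptions';

const pubsub = new PubSub(); // in production, a shared engine such as Redis

function onSchemaMigrated(migratedSchemaId) {
  pubsub.publish('mutation.migrateSchema', { id: migratedSchemaId });
}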
Implementing Subscriptions on the Server
Apollo does a great job simplifying the logic needed to implement subscriptions on the server. In practice, subscriptions are essentially an intelligent wrapper around the pubsub protocol. In a pubsub system, any number of subscribers can ‘subscribe’ to a channel and any number of publishers can ‘publish’ messages to those channels. When a message is published to a channel, every subscriber that is subscribed to that channel will receive the message.
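For instance, with the simple in-memory PubSub that ships with graphql-subscriptions, the raw pubsub flow (with no GraphQL involved yet) looks roughly like this:

import { PubSub } from 'graphql-subscriptions';

const pubsub = new PubSub();

// A subscriber registers interest in a channel...
pubsub.subscribe('mutation.createPost', payload => {
  console.log('received:', payload);
});

// ...and any publisher can push a message to that channel.
// Every current subscriber on 'mutation.createPost' receives it.
pubsub.publish('mutation.createPost', { title: 'Hello subscriptions' });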
Apollo exposes a SubscriptionManager class that abstracts away a lot of the hardships of maintaining these subscriptions. Let’s take a look at how this helps us think about subscriptions.
This is what the SubscriptionManager constructor looks like:
class SubscriptionManager {
  constructor(options: {
    schema: GraphQLSchema,
    setupFunctions: SetupFunctions,
    pubsub: PubSubEngine,
  }) { ... }
}
As you can see, a SubscriptionManager takes a GraphQLSchema, a mysterious SetupFunctions object, and a PubSubEngine. One of the nice things about the SubscriptionManager is that it makes no assumptions about your PubSubEngine. As long as your engine has ‘publish’ and ‘subscribe’ methods, it will work! That covers the PubSubEngine, and the GraphQLSchema can be any GraphQL schema instance. But what about SetupFunctions?
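To show how small that contract is, here is a rough sketch of a custom engine built on Node’s EventEmitter. The exact methods and signatures the SubscriptionManager expects have shifted between versions, so treat this as illustrative rather than a drop-in implementation:

import { EventEmitter } from 'events';

// A minimal pubsub engine: publish/subscribe (plus unsubscribe for cleanup)
// backed by an in-process EventEmitter.
class EventEmitterPubSub {
  constructor() {
    this.emitter = new EventEmitter();
    this.subscriptions = new Map();
    this.nextId = 0;
  }

  publish(triggerName, payload) {
    this.emitter.emit(triggerName, payload);
    return true;
  }

  subscribe(triggerName, onMessage) {
    const id = this.nextId++;
    this.subscriptions.set(id, { triggerName, onMessage });
    this.emitter.on(triggerName, onMessage);
    return Promise.resolve(id);
  }

  unsubscribe(subId) {
    const { triggerName, onMessage } = this.subscriptions.get(subId);
    this.emitter.removeListener(triggerName, onMessage);
    this.subscriptions.delete(subId);
  }
}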
Subscription Setup Functions
This is the key abstraction that makes our lives a lot simpler. To make sense of it, let’s take a look at the SetupFunctions type definition:
interface SetupFunctions {
  [subscriptionName: string]: SetupFunction;
}

interface SetupFunction {
  (options: SubscriptionOptions, args: {[key: string]: any}, subscriptionName: string): TriggerMap;
}

interface TriggerMap {
  [triggerName: string]: TriggerConfig;
}

interface TriggerConfig {
  channelOptions?: Object;
  filter?: Function;
}
First off, types in JavaScript/TypeScript are awesome. Types make it much easier to reason about and understand our code. If you aren’t already using them I highly recommend checking out TypeScript or Flow.
Back to subscriptions. As you can see, SetupFunctions is a mapping from a subscriptionName to a function that returns a TriggerMap, which is a mapping from a triggerName to a TriggerConfig, which itself defines two optional properties: channelOptions and filter. Wow, that was a mouthful. The channelOptions object lets us dynamically add context to channels (there is a good write-up about channelOptions if you want more detail), and the filter function lets you decide whether a given object pushed via a trigger should fire a subscription query or not.
Note: A triggerName is synonymous with the PubSub channel names that fire a subscription.
A Full Example
Let’s take a simple schema and go over how to set up a SubscriptionManager.
# schema.graphql
schema {
  query: ...
  mutation: ...
  subscription: Subscription
}

type Subscription {
  subscribeToNewPosts(filter: PostSubscriptionFilter): Post
}

type Post {
  title: String!
  content: String
}

input StringFilter {
  eq: String
  lt: String
  gt: String
  matches: String
}

input PostSubscriptionFilter {
  title: StringFilter
  content: StringFilter
}
While we’re at it, let’s create the actual GraphQLSchema object from our GraphQL document. makeExecutableSchema takes a GraphQL document string and a resolver map and combines the two into a graphql-js GraphQLSchema.
import schemaGql from './schema.graphql';
import { makeExecutableSchema } from 'graphql-tools';

const resolverMap = {
  Mutation: ...,
  Query: ...,
  Subscription: {
    subscribeToNewPosts(newPost, args, context, info) {
      return newPost;
    },
  },
};

const schema = makeExecutableSchema({
  typeDefs: schemaGql,
  resolvers: resolverMap,
});
Our schema implements a single subscription: subscribeToNewPosts. The subscribeToNewPosts subscription takes a filter argument and returns a stream of new Post data that satisfies the filter. For example, we might issue a query like this:
subscription SubscribeToNewPosts(
  $postFilter: PostSubscriptionFilter
) {
  subscribeToNewPosts(filter: $postFilter) {
    title
    content
  }
}
With the variables:
{ "postFilter": { "title": { "matches": ".*GraphQL.*" } } }
This subscription will get pushed data every time a new Post is created with a title that matches the regular expression /.*GraphQL.*/.
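The shape of the pushed data mirrors the selection set of the subscription query, so a matching event would deliver a payload along these lines (the values are made up for illustration):

{
  "data": {
    "subscribeToNewPosts": {
      "title": "Why GraphQL subscriptions are great",
      "content": "Pushed to every subscriber whose filter matched."
    }
  }
}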
Now that we have the API defined, let’s take a look at the SetupFunctions for our schema.
The Setup Functions
To implement our schema, we would pass the SubscriptionManager a SetupFunctions object like this:
/*
 * A simple example. The PubSub implementation is based on Node.js
 * EventEmitters and thus will only work if your whole app runs in a
 * single process. In production you would likely use something like
 * this excellent redis implementation:
 * https://github.com/davidyaha/graphql-redis-subscriptions
 */
import { PubSub, SubscriptionManager } from 'graphql-subscriptions';
import schema from './schema';

const pubsub = new PubSub();

const ourSetupFunctions = {
  // The name of the subscription in our schema
  subscribeToNewPosts: (options, { filter }, subName) => ({
    // A pubsub channel that fires the
    // 'subscribeToNewPosts' subscription
    'mutation.createPost': {
      filter: post => {
        // We can do any filtering we want here.
        // Returning true will push data to the client.
        return valueSatisfiesFilter(filter, post);
      },
    },
  }),
};

const subscriptionManager = new SubscriptionManager({
  schema: schema,
  pubsub: pubsub,
  setupFunctions: ourSetupFunctions,
});
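Note that valueSatisfiesFilter is not part of graphql-subscriptions; it is an application-specific helper we left out above. A minimal sketch, assuming the StringFilter operators from the schema (eq, lt, gt, matches), might look like this:

// Hypothetical helper: checks a post against a PostSubscriptionFilter.
// Every field filter provided (e.g. { matches: '.*GraphQL.*' }) must be satisfied.
function valueSatisfiesFilter(filter, post) {
  if (!filter) return true;
  return Object.keys(filter).every(field => {
    const fieldFilter = filter[field]; // e.g. filter.title
    const value = post[field];         // e.g. post.title
    return Object.keys(fieldFilter).every(op => {
      switch (op) {
        case 'eq': return value === fieldFilter.eq;
        case 'lt': return value < fieldFilter.lt;
        case 'gt': return value > fieldFilter.gt;
        case 'matches': return new RegExp(fieldFilter.matches).test(value);
        default: return false;
      }
    });
  });
}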
Notice how our setup function is structured. The first-level keys are the names of our subscriptions, and the second-level keys are the names of any channels that should activate each subscription. In our example, the only channel that fires the subscribeToNewPosts subscription is mutation.createPost. That was purely for simplicity; a subscription can have as many triggers as you want.
For example, Scaphold allows you to subscribe to multiple events with a single subscription.
subscription SubscribeToNewAndUpdatedPosts(
  $postFilter: PostSubscriptionFilter
) {
  subscribeToPosts(
    mutations: [createPost, updatePost],
    filter: $postFilter
  ) {
    mutation
    value {
      title
      content
    }
  }
}
The above subscription behaves similarly to before, except it will be fired whenever a post is created OR updated. To enable this type of functionality, you would need setup functions that look like this:
const ourSetupFunctions = {
  subscribeToPosts: (options, { filter }, subName) => ({
    // The channel for create mutations
    'mutation.createPost': {
      filter: post => {
        return valueSatisfiesFilter(filter, post);
      },
    },
    // The channel for update mutations
    'mutation.updatePost': {
      filter: post => {
        return valueSatisfiesFilter(filter, post);
      },
    },
  }),
};
In this example, the subscribeToPosts subscription would be triggered by both the createPost and updatePost mutations. Having multiple triggers per subscription is a really nice feature that lets you create more interesting subscriptions.
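For those triggers to ever fire, something has to publish to the corresponding channels, and in practice that is the mutation resolvers’ job. Here is a minimal sketch; the resolver bodies and the createPostInDatabase/updatePostInDatabase helpers are assumptions for illustration, not part of the original example:

// Extending the resolver map from earlier: each mutation persists the post
// and then publishes on the same pubsub instance that was handed to the
// SubscriptionManager, which is what fires the triggers above.
const resolverMap = {
  Mutation: {
    async createPost(root, { input }, context) {
      const newPost = await createPostInDatabase(input); // hypothetical helper
      pubsub.publish('mutation.createPost', newPost);
      return newPost;
    },
    async updatePost(root, { input }, context) {
      const updatedPost = await updatePostInDatabase(input); // hypothetical helper
      pubsub.publish('mutation.updatePost', updatedPost);
      return updatedPost;
    },
  },
  // Query and Subscription resolvers as before
};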
Great, that wraps up most of the logic necessary to implement subscriptions on the server. Now let’s finish by creating our server and connecting the pieces.
Finishing Up
Now that we have our SubscriptionManager, all we need to do is connect it to an HTTP server so our API can field requests from our client apps.
import { createServer } from 'http';
import { Server } from 'subscriptions-transport-ws';
import mySubscriptionManager from './subscriptionManager';

const WS_PORT = 3001;

const httpServer = createServer((request, response) => {
  response.writeHead(404);
  response.end();
});

httpServer.listen(WS_PORT, () => console.log(
  `Websocket Server is now running on http://localhost:${WS_PORT}`
));

const server = new Server(
  { subscriptionManager: mySubscriptionManager },
  httpServer
);
And that does it! We now have a real-time GraphQL server!
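On the client side, an app can open a websocket to that server and start a subscription. The client API of subscriptions-transport-ws has changed significantly since this was written, so treat this as a sketch that assumes the legacy Client export that pairs with the Server used above:

import { Client } from 'subscriptions-transport-ws';

// Connect to the websocket server we just started.
const wsClient = new Client('ws://localhost:3001');

// Start a subscription; the handler fires every time the server pushes data.
const subId = wsClient.subscribe({
  query: `
    subscription SubscribeToNewPosts($postFilter: PostSubscriptionFilter) {
      subscribeToNewPosts(filter: $postFilter) {
        title
        content
      }
    }
  `,
  variables: { postFilter: { title: { matches: '.*GraphQL.*' } } },
}, (errors, result) => {
  if (errors) {
    console.error(errors);
    return;
  }
  console.log('new post:', result);
});

// Later, stop listening:
// wsClient.unsubscribe(subId);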
Takeaways
Our first takeaway is that GraphQL subscriptions are awesome. The combination of GraphQL and real-time data allows us to build more interesting applications faster than ever. The second takeaway is that Apollo’s tools can greatly improve the experience of developing GraphQL applications. I hope this article gave you a better understanding of what it takes to implement subscriptions on your own servers.
That being said, if you want to start building real-time, GraphQL applications without needing to write your own API or manage your own infrastructure, take a look at Scaphold.io! In a few minutes, you can have a real-time, GraphQL API of your own design deployed and ready to power your next app. The schema designer will walk you through defining your data model and you can then immediately start issuing subscription queries directly from GraphiQL. The best part: it’s free!
Thanks for reading! If you’re still interested take a look at our front-end GraphQL Subscriptions tutorial and follow along as we build Slackr, a real-time chat app.