graphql-engine

This note is in Control.Monad.Memoize.

Tying the knot

GraphQL type definitions can be mutually recursive, and indeed, they quite often are! For example, two tables that reference one another will be represented by types such as the following:

type author {
  id: Int!
  name: String!
  articles: [article!]!
}

type article {
  id: Int!
  title: String!
  content: String!
  author: author!
}

This doesn’t cause any trouble if the schema is represented by a mapping from type names to type definitions, but the Parser abstraction is all about avoiding that kind of indirection to improve type safety — parsers refer to their sub-parsers directly. This presents two problems during schema generation:

  1. Schema generation needs to terminate in finite time, so we need to ensure we don’t try to eagerly construct an infinitely-large schema due to the mutually-recursive structure.

  2. To serve introspection queries, we do eventually need to construct a mapping from names to types (a TypeMap), so we need to be able to recursively walk the entire schema in finite time.

Solving point number 1 could be done with either laziness or sharing, but neither of those is enough to solve point number 2, which requires /observable/ sharing. We need to construct a Parser graph that contains enough information to detect cycles during traversal.
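To make this concrete, here is a sketch of how a traversal can use per-node identity to build a TypeMap from a cyclic graph in finite time. The Parser type below is a hypothetical, drastically simplified stand-in for the real one (which carries far more than a name and sub-parsers), and buildTypeMap is an illustrative helper, not the real implementation:

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.Set as Set
import Data.Unique (Unique, newUnique)

-- A hypothetical, drastically simplified stand-in for the real Parser
-- type: each node carries a name, a per-node identity, and direct
-- (possibly cyclic) references to its sub-parsers.
data Parser = Parser
  { parserName :: String
  , parserId   :: Unique    -- the identity that makes sharing observable
  , subParsers :: [Parser]
  }

-- Build a TypeMap by walking the graph, skipping nodes whose identity we
-- have already seen. Without parserId, a cyclic graph would make this
-- traversal loop forever.
buildTypeMap :: Parser -> Map.Map String Parser
buildTypeMap root = snd (go (Set.empty, Map.empty) root)
  where
    go acc@(seen, typeMap) p
      | parserId p `Set.member` seen = acc
      | otherwise =
          foldl go
            ( Set.insert (parserId p) seen
            , Map.insert (parserName p) p typeMap )
            (subParsers p)
```

With this in hand, a cyclic author/article graph like the one in the example above can be built with a recursive let and traversed safely: name-based lookup is never needed, yet the traversal still terminates.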

It may seem appealing to just use type names to detect cycles, which would allow us to get away with using laziness rather than true sharing. Unfortunately, that approach creates further problems of its own.

So we’re forced to use sharing. But how do we do it? Somehow, we have to /tie the knot/ — we have to build a cyclic data structure — and some of the cycles may be quite large. Doing all this knot-tying by hand would be incredibly tricky, and it would require a lot of inversion of control to thread the shared parsers around.

To avoid contorting the program, we instead implement a form of memoization. The MonadMemoize class provides a mechanism to memoize a parser constructor function, which allows us to get sharing mostly for free. The memoization strategy also annotates cached parsers with a Unique that can be used to break cycles while traversing the graph, so we get observable sharing as well.
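A minimal sketch of this strategy, loosely modeled on memoizeOn but with a hypothetical simplified cache type, IO instead of the real monad stack, and the Unique annotation omitted, might look like this:

```haskell
import Data.IORef
import qualified Data.Map.Lazy as Map  -- lazy in values: we cache placeholders
import System.IO.Unsafe (unsafeInterleaveIO)

-- Hypothetical simplified cache; the real code keys on more than a plain
-- name and annotates each cached parser with a Unique for cycle detection.
type Cache p = IORef (Map.Map String p)

-- Memoize a parser constructor. On a cache hit, return the shared parser.
-- On a miss, insert a lazy placeholder *before* running the constructor,
-- so that recursive lookups (direct or mutual) resolve to the very parser
-- we are in the middle of building -- this is what ties the knot.
memoizeOn :: Cache p -> String -> IO p -> IO p
memoizeOn cacheRef key build = do
  cache <- readIORef cacheRef
  case Map.lookup key cache of
    Just parser -> pure parser
    Nothing -> do
      resultRef <- newIORef (error "memoizeOn: parser demanded before it was built")
      -- Defer the read until the placeholder is actually forced; by then
      -- the write below will have filled resultRef, so cycles work out.
      placeholder <- unsafeInterleaveIO (readIORef resultRef)
      modifyIORef cacheRef (Map.insert key placeholder)
      parser <- build
      writeIORef resultRef parser
      pure parser
```

A constructor memoized this way can freely invoke itself or a mutually-recursive partner: the inner lookup hits the cache and returns the shared placeholder instead of recursing forever, and each constructor runs exactly once per key.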