Proxies, markers, and observability


Using JavaScript’s ES6 Proxy to ease the pain of instrumenting our AWS Lambda functions for observability in our serverless platform.

The thing about distributed functions

After working with AWS Lambda for about three years it’s hard not to think it’s the solution to all our architectures. It’s great! It gives you isolation, scalability, simplicity, and plays very well with event-driven architectures.

Plus a bunch of other benefits:

Lambda is a better abstraction from the good ole’ cgi-bin days.

One of the main challenges with functions as a service (and microservices) is figuring out what’s going on inside at a platform level: distributed systems can easily grow out of control, becoming hard to manage and understand.

As the number of services and engineers in our serverless platform grew, we started to face these problems. Looking for solutions, I ran into AWS X-Ray, Envoy, and IOPipe.

X-Ray is AWS’s de facto solution for debugging and analyzing distributed services. At the time it didn’t provide an SDK for Node.js, and that limited our adoption.

Envoy is the most widely adopted solution for containers, but I couldn’t figure out a good abstraction that fit the serverless paradigm.

IOPipe provides tracing, alerting, and profiling. It integrates with Lambda functions and has a plugin for the Serverless Framework.

That looked like a good fit for our needs. It provides a lot of features out of the box and looks effortless at first sight. I added it to the project and wrapped our functions' handlers with it:

const iopipeLib = require('@iopipe/iopipe')
const tracePlugin = require('@iopipe/trace')

const iopipe = iopipeLib({
  token: process.env.IOPIPE_TOKEN || '',
  enabled: Boolean(process.env.IOPIPE_TOKEN),
  plugins: [
    tracePlugin({
      autoHttp: true
    })
  ]
})

let counter = 0

module.exports.handler = iopipe(async (event, context) => {
  counter++

  return {
    statusCode: 200,
    body: JSON.stringify({ counter })
  }
})
A simple λ wrapped with IOPipe.

After that, I invoked the function and jumped to IOPipe’s dashboard:

IOPipe’s Function View

You get a centralized view with information like number of function invocations, number of errors, memory and CPU usage, duration, cold start hinting, and aggregated log information.

But for meaningful information that can help us understand what’s going on at a platform level, you need to instrument the codebase. That means adding markers and labels before and after all the methods that you want to introspect:

// `label` and `mark` come from the IOPipe agent (iopipe.label, iopipe.mark)
label('courses')
mark.start('courses.list')
const courses = await Courses.list()
mark.end('courses.list')

...

mark.start('serializer.jsonapi')
const jsonapi = dataSerializer.serialize(courses)
mark.end('serializer.jsonapi')

Labels tag related functionality and behavior, helping us answer questions like which functions are using our GraphQL client? and how much time, on average, is session creation taking?

Markers add tracing and time spans to the behavior of the functions, helping us answer questions like what’s making this function slow? or which operations can be improved?

We instrument our models’ async methods, external service calls, data-layer calls (ORM, DB clients, SDK calls, etc.), and data-mapping methods. That provides enough information to peek into the platform:

Courses Listing λ — Tracing information

In the example we can now see that, in a listing operation, a third of the time goes to authenticating the request token, most of it spent fetching the user’s session. Then almost half of the execution time goes to getting the data out of the database.

After seeing the data we came up with two straightforward strategies to improve the performance of the function:

  • Adding a caching layer for the User’s session that only expires when they interact with an API that changes their state: Things like favoriting a Course and following a Topic or Lecturer.
  • Adding a time to live (TTL) to keep Courses listings in memory for a reasonable amount of time; avoiding trips to the database.
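The second strategy can be sketched as a small in-memory cache with a TTL check on read. This is a minimal illustration, not our actual implementation; `createTTLCache`, `listCourses`, and the `Courses` model reference are assumptions made up for the example:

```javascript
// Minimal in-memory TTL cache sketch. Entries expire lazily on read.
const createTTLCache = (ttlMs) => {
  const store = new Map()

  return {
    get (key) {
      const entry = store.get(key)
      if (!entry) return undefined
      if (Date.now() - entry.cachedAt > ttlMs) {
        store.delete(key) // expired: the caller falls through to the database
        return undefined
      }
      return entry.value
    },
    set (key, value) {
      store.set(key, { value, cachedAt: Date.now() })
    }
  }
}

// Hypothetical usage: check the cache before hitting the database.
const coursesCache = createTTLCache(60 * 1000) // keep listings for one minute

const listCourses = async () => {
  const cached = coursesCache.get('courses.list')
  if (cached) return cached

  const courses = await Courses.list() // the expensive trip to the database
  coursesCache.set('courses.list', courses)
  return courses
}
```

Because Lambda containers are reused between invocations, module-level state like this cache survives across warm invocations, which is what makes the trick worthwhile.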

Instrumenting functions opens up a window to look into the platform, unveiling behavior that would be hard to find and debug otherwise.

But adding labels and markers all through the code gets tiring. It makes the code look contrived, and gives you the feeling that half your code is just instrumentation.

Last week, Felipe Guizar and I were preparing our weekly Birds of a Feather architecture session. He was doing a talk on frontend reactive architectures. We started by talking about how Vue uses setters and getters to implement reactivity. He prototyped a store with event listeners, and used setters to notify the store of value changes. Then we moved on to implementing the same reactivity with ES6 Proxy.

That got me thinking about Observability again…

Lyft uses Envoy Proxy to automagically instrument all their microservices. Envoy is an out-of-process proxy used on service-oriented architectures. It handles distributed tracing, metrics, logging, load balancing, service discovery, circuit breaking and retries, TLS terminations, etc.

Everything is abstracted from the applications’ logic by proxying the network. All service traffic flows via Envoy in a consistent, platform-agnostic way:

Envoy Proxy — Service Communication

In the same sense, JavaScript’s Proxy gives us a way to peek into the behavior of our objects:

const Person = function (name, age = 42) {
  this.name = name
  this.age = age
}

const me = new Person('eduardo', 35)
Start with a simple Person object, and instantiate it.
const Observe = (AnObject) => {
  return new Proxy(AnObject, {
    get (target, prop) {
      const method = target[prop]

      if (typeof method === 'function') {
        console.log(`\`${prop}\` (a method) was accessed`)
      } else {
        console.log(`\`${prop}\` was accessed, and has a value of ${target[prop]}`)
      }
      return target[prop]
    },

    set (obj, prop, value) {
      const oldValue = obj[prop]

      obj[prop] = value

      console.log(`${prop} changed from \`${oldValue}\` to \`${value}\`.`)
      return true
    }
  })
}
Create a Proxy Object that wraps all properties and method calls.

The Proxy object intercepts all operations of an object: property lookups, assignments, method calls, etc.

Anything that happens on the wrapped object can be “seen”:

ObservedPerson will log all properties when they’re accessed or changed.
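The original embed for that example is gone, but the idea can be approximated with a short, self-contained sketch (a slimmed-down `Observe` with the same get/set traps; the log output in the comments matches the `console.log` calls in the traps):

```javascript
// Wrap a plain object in a Proxy that logs every read and write.
const Observe = (anObject) =>
  new Proxy(anObject, {
    get (target, prop) {
      console.log(`\`${String(prop)}\` was accessed`)
      return target[prop]
    },
    set (obj, prop, value) {
      console.log(`\`${String(prop)}\` changed from \`${obj[prop]}\` to \`${value}\``)
      obj[prop] = value
      return true
    }
  })

const me = { name: 'eduardo', age: 35 }
const ObservedPerson = Observe(me)

ObservedPerson.name     // logs: `name` was accessed
ObservedPerson.age = 36 // logs: `age` changed from `35` to `36`
```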

I sprinkled some IOPipe magic into the proxy implementation, creating a wrapper object that autogenerates labels and markers from the object’s methods:

const Observe = (AnObject, Label = '') => {
  const name = Label || generateLabel(AnObject)

  if (DISABLE_TRACING) {
    return AnObject
  }

  return new Proxy(AnObject, {
    get (target, prop) {
      let label = ''
      if (name) {
        iopipe.label(name)
        label = `${name}-`
      }

      const method = target[prop]
      const isPromise = method instanceof Promise
      const isAsyncFunction = typeof method === 'function' &&
        method[Symbol.toStringTag] === 'AsyncFunction'

      const marker = `${label}${Inflector.dasherize(Inflector.underscore(prop))}`

      if (isPromise) {
        iopipe.mark.start(marker)

        // return a new Promise that will stop the marker after the promise finishes
        return Promise.resolve(method).finally(() => { iopipe.mark.end(marker) })
      } else if (isAsyncFunction) {
        iopipe.mark.start(marker)

        // return a new async function that will await the call and stop the marker
        return async (...args) => {
          const result = await method(...args)
          iopipe.mark.end(marker)

          return result
        }
      }
      return target[prop]
    },
  })
}

And wrapped all of our exports with that object:

module.exports = Observe(CoursesModel)

The proxied object behaves exactly the same, but it traps all async calls, adding a label with the pattern object-method, starting a marker before executing each call and ending it afterwards. All this without changing the developer experience.
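So, for a model named courses-model with an async listByTopic method, the emitted marker would be courses-model-list-by-topic. A rough stand-in for the Inflector.underscore + Inflector.dasherize calls above (an illustrative sketch, not the inflected library itself):

```javascript
// camelCase -> snake_case -> dash-case, approximating Inflector's behavior.
const dasherize = (name) =>
  name.replace(/([a-z0-9])([A-Z])/g, '$1_$2').toLowerCase().replace(/_/g, '-')

const marker = (label, prop) => `${label}-${dasherize(prop)}`

marker('courses-model', 'listByTopic') // 'courses-model-list-by-topic'
```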

After removing the tracing and labeling calls, the code looks clean and sleek while keeping all its observability benefits:

const courses = await Courses.list()
const data    = dataSerializer.serialize(courses)

There are some areas where it’s still valuable to manually measure and mark our code, but for the most part this just works and it’s simple.

This approach gives us a consistent practice to reuse across the platform. I still like Envoy’s approach a lot more, because it’s language-agnostic, nobody needs to think about it (or forget it), and it brings other features for free. But using ES6 Proxy to add tracing seamlessly feels like a step in the right direction for our serverless practice.

More resources

Thanks to Rafael Castro, Fernando Alvarez, Matheus Sampaio, Felipe Guizar Diaz, and Carlos Alexis González Gómez for beta testing this article and helping publish a better release.


This post was originally published in my Medium account. You can find the original version here.
