Event augmentation

Introduction

It makes sense to try to keep your data layer as simple and as granular as possible.

  • Simple; so it’s easy for your development team to implement
  • Granular; so that you’re reporting on data points at the level they are generated, you’re making it easier on yourself to group and format the data however you like rather than tying yourself into a fixed format, and you’re not adding technical debt and build time for your developers by asking them to concatenate values or otherwise alter what the platform and backend systems return

For example - like this:

dataLayer.push({
  "event": "test",
  "items": [
    {
      "id": "abc123",
      "sku": "xyz789",
      "brand": "loop_horizon",
      "category": "white_paper",
      "name": "data_layer_augmentation"
    }
  ]
})        

Rather than like this:

dataLayer.push({
  "event": "test",
  "items": [
    {
      "id_sku": "abc123_xyz789",
      "brand_category_name": "loop_horizon|white_paper|data_layer_augmentation",
      "name_upper_case": "DATA_LAYER_AUGMENTATION"
    }
  ]
})        

However, different vendor tools will need data in different formats and business stakeholders will undoubtedly want data formatting, concatenating and otherwise tweaking to meet their needs.

You can do all that in your tag manager of choice, and this article covers a particularly neat way of going about it that avoids race conditions by allowing you to augment the event object itself.

Problem statement

Having a simple granular data layer is great, but it can cause a headache when you need to format the data to meet different vendor and business needs.

For example: if you want to use the Search Discovery Product String Builder GUI extension in Adobe Launch, but you have nested arrays of data in the product object.

The two biggest pains are:

1 - Having to repeat the same code over and over to re-write the data layer to the format each tool and business case requires

2 - Race conditions caused by:

a) Data not processing in the right order:

  • Data is processed on trigger A
  • Vendor tag X also fires on trigger A
  • Vendor tag X completes before the data has processed, causing the processed data to be missed

b) Data not processing fast enough:

  • Data is processed on trigger A
  • Vendor tag X also fires on trigger A
  • You successfully order things so that the data processing job is started before vendor tag X fires, but because firing is asynchronous, vendor tag X completes before the data has processed, causing the processed data to be missed

c) New data appearing before your tag fires causing the wrong data to be collected and the data you wanted to be missed:

  • Data is processed on trigger A and trigger B
  • Vendor tag X fires on trigger A only
  • Before vendor tag X has completed, trigger B fires, overwriting the data on your global object, causing the wrong processed data to be collected

All three of these issues arise from abstraction - the processed data object you’re referencing is abstracted away from the event that triggered it (and that event’s associated meta data). That is to say: trigger A fires with associated meta data. As the data is processed to a new object - object Y… it has been abstracted away from the trigger and associated meta data.

This opens up the possibility of timing issues causing object Y not to contain the processed data (the code to populate object Y hasn’t run fast enough) or to contain the wrong processed data (the code to populate object Y has run more than once in the time it’s taken your tag code to reference it). By consuming data from the abstracted object Y rather than directly from the triggering event meta data, the various vendor tags can end up with no information and/or the wrong information.

dataLayer.push({
  "event": "A",
  "items": [
    {
      "id": "abc123",
      "sku": "xyz789",
      "brand": "loop_horizon",
      "category": "white_paper",
      "name": "data_layer_augmentation"
    }
  ]
});

/*
Data processing triggered by custom event A in your TMS (listener for "A" applied to data layer updates)
*/
//returns the concatenated id and sku to a new variable on the window object
var concatData = dataLayer[dataLayer.length - 1].items.map(function(item){ return item.id + "_" + item.sku; });

/*
Vendor tag X triggered by custom event A in your TMS (listener for "A" applied to data layer updates).
Assume this is triggered by TMS so ordering is async and not always guaranteed
*/
console.log(concatData);
/*
Returns "undefined" because concatData has not completed processing (unlikely with this exact example,
but more complex processing over a larger items array could see this issue).
Additionally, imagine - due to async running or issues with tag ordering - that console.log(concatData);
ran above the data processing script;
that would always lead to "undefined"!
*/        
dataLayer.push({
  "event": "A",
  "items": [
    {
      "id": "abc123",
      "sku": "xyz789",
      "brand": "loop_horizon",
      "category": "white_paper",
      "name": "data_layer_augmentation"
    }
  ]
});

dataLayer.push({
  "event": "B",
  "items": [
    {
      "id": "321qwe",
      "sku": "654zyx",
      "brand": "loop_horizon",
      "category": "beans",
      "name": "baked_beans"
    }
  ]
});

/*
Data processing triggered by custom event A or B in your TMS (listener for "A" or "B" applied to data layer updates)
*/
//returns the concatenated id and sku to a new variable on the window object
var concatData = dataLayer[dataLayer.length - 1].items.map(function(item){ return item.id + "_" + item.sku; });

/*
Vendor tag X triggered by custom event A only in your TMS (listener for "A" applied to data layer updates).
Assume this is triggered by TMS so ordering is async and not always guaranteed
*/
console.log(concatData);
/*
Returns "321qwe_654zyx" because concatData has run twice before vendor tag X fires.
This is the wrong data - it does not relate to event A at all
*/        

Solution

At Loop Horizon, we have tackled this problem many times over the years and the solution is to try to reference the trigger / event meta data directly where you can.

But how to square the circle of processing the trigger / event meta data while still referring directly to it, and not having to reference an abstracted object?

By augmenting the trigger / event meta data itself with your processed data; processing and pushing your augmented data directly back on to the triggering event meta data!

As well as our real-world experience with clients, we’ve specifically tested this in Adobe Launch (across four data layer handling methods) and Google Tag Manager (which is its own data layer handler); it works nicely on all. Some tools are more limiting than others in how easily you can utilise this technique, but they all present options for augmenting the trigger event meta data itself so you can reference your processed data directly from the event and avoid race conditions.
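Stripped of any particular TMS, the core of the technique can be sketched as a wrapper around dataLayer.push. This is a hedged illustration only - the field name id_sku_list is an assumption, and real handlers (GTM, for example, replaces push itself) mean that where and when you install a wrapper like this needs care:

```javascript
// Generic sketch: augment each pushed event object in place, before any
// listener processes it. The processed values travel on the event object
// itself, so there is no separate global object to race against.
var dataLayer = dataLayer || [];

(function () {
  var originalPush = dataLayer.push;
  dataLayer.push = function () {
    for (var i = 0; i < arguments.length; i++) {
      var ev = arguments[i];
      if (ev && Array.isArray(ev.items)) {
        // Illustrative augmentation: concatenate id and sku for each item
        ev.id_sku_list = ev.items.map(function (item) {
          return item.id + "_" + item.sku;
        });
      }
    }
    // Hand the (now augmented) events on to the original push
    return originalPush.apply(dataLayer, arguments);
  };
})();
```

The point to take away is structural: because the augmentation happens on the event object itself as it enters the data layer, every consumer of that event sees the processed data with no ordering dependency.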

Adobe Launch

Adobe Launch utilises rules. Each rule requires an event to trigger it (what trigger fires the rule, and under what conditions). Then you can apply multiple actions (when the rule has triggered, do X, Y and Z) to happen off the back of that trigger.

1 - Rule A:

a) What event triggers the rule to fire (e.g. a mouse click on button X)

b) Under what conditions (e.g. but only on the homepage)

c) What happens when the above event triggers and conditions are met (e.g. send some information to Google Analytics)

It has easily configurable rule ordering - for rules which utilise the same triggering mechanism - so you can ensure your scripts run in the right order. It also has a global setting that runs rule actions synchronously (within the tag manager), ensuring that action A has completed before Adobe Launch runs action B. In our experience, this works across rules which utilise the same triggering mechanism too: the last action of rule X triggering from event A runs before the first action of rule Y triggering from event A.

Adobe Launch also offers a range of extensions that consume data layer push events and process them for use as rule trigger events, conditions and actions. How each extension hooks on to the data layer affects how easy it is to use when attempting to augment the event object.

Data Layer Manager (Search Discovery)


Data Layer Manager looks to hook its event handler directly on to the data layer object. For our purposes, this is ideal - it means that when you augment the event object, your augmented data is pushed back on to the data layer itself. This means that not only can you access the augmented data in subsequent rules in Adobe Launch, but it makes the augmented data available to tools outside Adobe Launch. It also makes debugging a lot easier.

Creating the central function

Create a central function that will augment the event object and run it on page load as early as possible (or if you’re feeling daring, build a local Adobe Launch extension):

//Here's a dummy function you could create to augment your data
function augmentEventObject(eventObject){
  /*----For Data Layer Manager, eventObject is "event.detail".
  We use eventObject || {} in case it is undefined for whatever reason, to avoid JS errors----*/
  var ev = eventObject || {},
      /*----Augment script - yours would be different; I'm just going to push some dummy data onto the event.
      Maybe you'd run a map function over your product items array, augmenting data for a range of product objects----*/
      dummyData = {
        'dummy1': 'some dummy data',
        'dummy2': 'some more dummy data',
        'dummy3': 'additional dummy data',
        'dummy4': 'where will it end?',
        'dummy5': 'ah ok, that is all of it'
      };
  /*----This is the key to the whole procedure.
  Once you've processed your data, assign it back to the event object the function received.
  Note we assign to ev (the defaulted reference), so an undefined eventObject can't cause an error----*/
  Object.assign(ev, dummyData);
}
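As the comments above suggest, a more realistic augment step might map over a granular items array. Here’s a hedged sketch of what that could look like - the derived field names id_sku and name_upper_case are assumptions, echoing the concatenated formats from the introduction:

```javascript
// Illustrative variant of the central function: derive the concatenated /
// formatted fields each vendor wants from the granular items array, then
// write the results back onto the event object itself.
function augmentItems(eventObject) {
  var ev = eventObject || {};
  ev.items = (ev.items || []).map(function (item) {
    // Copy each item so the granular source fields are preserved untouched
    return Object.assign({}, item, {
      id_sku: item.id + "_" + item.sku,
      name_upper_case: String(item.name || "").toUpperCase()
    });
  });
  return ev;
}
```

You’d call this exactly like augmentEventObject, passing it the handler’s internal event object (event.detail in the case of Data Layer Manager).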

Augmenting the event object - Data Layer Manager

Note: the internal Adobe Launch event object for Data Layer Manager is event.detail

Once your function is created, you can then reference it in your rules that trigger from your data layer events. In this example, my data layer event will be “test_event”:

augmentEventObject(event.detail); //pass your function the event object
_satellite.logger.info(event.detail); //log the event object itself to confirm that it has worked        

And now to push the “test_event” on to the data layer, and review the outcome - the data gets pushed onto both the internal event object in Adobe Launch, and the data layer object:

And we can even demonstrate that we can pick up the data on a later rule by running a second test event, and ordering it after the first.

Here’s the trigger for rule 1:

And here’s the trigger for rule 2:

And a simple logger to the console to prove it works:

_satellite.logger.info('test rule 2 - run just before pushing the event object');
_satellite.logger.info(event.detail); //log the event object itself to confirm that it has worked        

Adobe Client Data Layer (Adobe)

Unlike the Data Layer Manager extension, Adobe Client Data Layer (Adobe) appears to abstract the event handler away from the data layer object.

This has its advantages - for example, if you want to use multiple mechanisms for processing the data layer (though it alters the standard data layer object in other ways that can render it unusable by other tools, so be aware of this if you’re trying to use this data layer handler in conjunction with Google products such as gtag, for example).

But this isn’t the ideal scenario for our needs. It means that when you augment the event object, your augmented data is not pushed back on to the data layer itself. Specifically for this data layer handling extension, you can access the augmented data with another action within the same rule. Outside the rule you cannot, so you’ll need to do one of the following:

  • Re-process the augmented object on to the data layer as a new event and then trigger your follow on rules from this new event
  • Send a new custom event or direct call event containing your augmented object as meta data (this reduces visibility of the augmented data, making the debug process more difficult, but it’s quite a neat solution)
  • Run the augmentation script as a pre-action for every rule you want to ingest the augmented data in (don’t do this one, by the way - it repeats code unnecessarily and introduces technical debt by forcing you to maintain the same code in multiple locations… it’s a sure-fire way to guarantee that this process breaks or becomes intermittently out of date at some point in the future)
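The second option - carrying the augmented object on a fresh custom event - could look something like this. It’s a hedged sketch only: the helper name, and the idea that your follow-on Launch rule would use a Custom Event trigger listening for the chosen event name on document, are assumptions for illustration:

```javascript
// Workaround sketch: dispatch the augmented payload on a new DOM custom
// event rather than re-pushing it to the data layer. A detached copy is
// taken so later data layer pushes can't mutate what the follow-on rule sees.
function dispatchAugmentedEvent(eventName, augmentedPayload) {
  var detail = JSON.parse(JSON.stringify(augmentedPayload || {}));
  if (typeof document !== "undefined" && typeof CustomEvent === "function") {
    document.dispatchEvent(new CustomEvent(eventName, { detail: detail }));
  }
  return detail; // returned so the caller can inspect the detached copy
}

// Hypothetical usage from inside rule 1, after augmenting event.message:
// dispatchAugmentedEvent('augmented_test_event', event.message);
```

As noted in the bullet above, this keeps the augmented data out of the data layer, which makes debugging harder - but it cleanly scopes the payload to the event that carries it.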

Anyway, on to the demonstration.

The process to create a central function that will augment the event object is identical to the Data Layer Manager explainer, so I won’t repeat that here - I’ll start from augmenting the event object.

Additionally, I’ll jump straight into the “two rule” approach demonstrated above so you can see how the augmented event object is only available within the first rule and is not available for subsequent rules.

Augmenting the event object - Adobe Client Data Layer

Note: the internal Adobe Launch event object for Adobe Client Data Layer is event.message

Once your function is created, you can then reference it in your rules that trigger from your data layer events. In this example, my data layer event will be “test_event”:

augmentEventObject(event.message); //pass your function the event object
_satellite.logger.info(event.message); //log the event object itself to confirm that it has worked        

And now to push the “test_event” on to the data layer, and review the outcome - the data gets pushed onto the internal event object in Adobe Launch but only for the rule in question. When you get to rule 2 in the chain, the augmented data is not present:

Here’s the trigger for rule 2, for reference:

And here’s the code to see if the augmented data is maintained from rule to rule:

_satellite.logger.info('test rule 2 - run just before pushing the event object');
_satellite.logger.info(event.message); //log the event object itself to confirm that it has worked        

Now let’s see what happens in the dev console:

Demonstrating a work around

I mentioned that you could re-process the augmented object on to the data layer as a new event and then trigger your follow on rules from this new event, so let’s demonstrate that.

Update the action in rule 1 to this:

augmentEventObject(event.message); //pass your function the event object
_satellite.logger.info(event.message); //log the event object itself to confirm that it has worked
event.message.event = 'test_event2'; // update the event name so you're not creating an endless loop
dataLayer.push(event.message); //re-push the data to the data layer as a new event        

And now you can update your rule 2 to trigger off your new data layer event (in this example - “test_event2”):

You can update your “test_event2” trigger to fire in order position 1 now if you like (as in the image). You’ve expressly daisy-chained the two rules to each other by making rule 1 create test_event2 to fire rule 2, so you don’t need to specify load ordering to ensure these rules fire in the right order (note: you’ll still potentially need load ordering for rules firing off the same event).

And now let’s see what we get in the developer console having made these updates:

And let’s have a look at the data layer so you can see this explicitly being pushed as a new event:

Note: you’re going to want to test this approach extensively to ensure it does not interfere with any of your other functionality by re-naming the event object within Launch; this is just an example of a possible solution.

Google Data Layer (Adobe)

Again, unlike the Data Layer Manager extension, Google Data Layer (Adobe) appears to abstract the event handler away from the data layer object.

This means that when you augment the event object, your augmented data is not pushed back on to the data layer itself. However, unlike Adobe Client Data Layer (Adobe), you can access the augmented event in subsequent rules - it appears that this extension updates the event object globally in Adobe Launch.

We should be pretty familiar with the process from the previous examples, so on to the demonstration again!

The process to create a central function that will augment the event object is identical to the Data Layer Manager explainer, so I won’t repeat that here - I’ll start from augmenting the event object.

Additionally, I’ll jump straight into the “two rule” approach demonstrated above so you can see how the augmented event object is available globally but does not get pushed back to the data layer.

Augmenting the event object - Google Data Layer

Note: the internal Adobe Launch event object for Google Data Layer is event.event.eventModel

Separately, it’s pretty interesting to explore the event.event object anyway, as this extension processes the data layer in a few interesting ways.

Once your function is created, you can then reference it in your rules that trigger from your data layer events. In this example, my data layer event will be “test_event”:

augmentEventObject(event.event.eventModel);
_satellite.logger.info(event.event.eventModel); //log the event object itself to confirm that it has worked        

And now to push the “test_event” on to the data layer, and review the outcome - the data gets pushed onto the internal event object in Adobe Launch for all rules, so when you get to rule 2 in the chain, the augmented data is still present:

Here’s the trigger for rule 2, for reference:

Note: I’ve referenced the original “test_event” here for rule 2, and set the ordering to 2 again because, unlike with the Adobe Client Data Layer extension, we don’t have to daisy-chain rules together or apply the workarounds - the augmented data is available in subsequent rules with this extension.

So… any subsequent rules that want to access the augmented data have to be ordered to fire after the data augmentation rule / script has run.

And here’s the code to see if the augmented data is maintained from rule to rule:

_satellite.logger.info('test rule 2 - run just before pushing the event object');
_satellite.logger.info(event.event.eventModel); //log the event object itself to confirm that it has worked        

Now let’s see what happens in the dev console:

But if you look directly at the data layer, you see that it does not contain the augmented data - only the event object within Adobe Launch:


Matt Bentley

Head of Data Architecture & Analytics at Loop Horizon

9 months ago

Got a really good question on my other post about this from Jared S. - so I thought I’d migrate it over to here. “Is there a reason you do this in the client?” It’s a really good point; server-side data enrichment and augmentation would be an ideal approach. In a perfect world, we'd perform the data manipulation server-side. But not all clients are ready yet and / or don't have budget or backing to make the jump; not all server-side platforms have the data manipulation capability required (e.g. some I've used have limited server-side array function capability) and for those that do - client-wise - new platforms would need budget and take significant time to onboard; and not all vendors can integrate with a server-side solution just yet (yet clients rely on them for revenue). I like to think of it as bringing server-side methodology / capability into the client, to act as a bridge between the old-world of data manipulation on the client, and the new-world of data manipulation server-side.
